author    | Gabe Black <gabeblack@google.com> | 2014-04-05 03:54:30 -0700
committer | Marc Jones <marc.jones@se-eng.com> | 2014-12-15 19:57:49 +0100
commit    | e5b21274bd3648a6797a2415e5ad654ca4609275 (patch)
tree      | 2314fb62e508d33d54a7142bdde8e096d1c8c5da /src/soc/nvidia/tegra124
parent    | c04d3dd7b3d026be03d2acbba1585d1824b5ea82 (diff)
download  | coreboot-e5b21274bd3648a6797a2415e5ad654ca4609275.tar.xz
tegra124: A couple of clock fixes.
This fixes two problems with the clock configuration on tegra124. First, the
macro that set up the i2c clocks tried to account for the fact that the i2c
divisor's LSB represents 1.0 where it normally represents 0.5 by multiplying
the target frequency by 2. Unfortunately, that doesn't work, because the
hardware divides by n + 1, and what n + 1 means depends on where the ones
place sits in the divisor encoding, as sketched below.
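To make the off-by-one concrete, here is a sketch of the two encodings (not
from the patch; DIV_U7_1 and DIV_INT are hypothetical names for illustration):

/* U7.1 fractional divisor (most peripheral clocks): out = ref / (n/2 + 1),
 * so n = ref * 2 / out - 2, which is the formula CLK_DIVIDER implements.
 * Integer i2c divisor (LSB represents 1.0): out = ref / (n + 1),
 * so n = ref / out - 1. */
#define DIV_U7_1(ref, out) ((ref) * 2 / (out) - 2) /* CLK_DIVIDER's math */
#define DIV_INT(ref, out)  ((ref) / (out) - 1)     /* what i2c needs */

/* Doubling the requested frequency gives
 *   DIV_U7_1(ref, 2 * out) = ref / out - 2,
 * one less than DIV_INT(ref, out), so the i2c bus ran too fast. */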
Also, when calculating the divisor, the standard C division operator truncates
any remainder, which tends to make the divisor smaller. That has the effect of
making the output frequency higher than what was requested. Since it's usually
safer to undershoot a frequency than to overshoot it, this change makes those
divisions round up instead.
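For reference, the round-up division likely follows the usual C idiom (a
sketch; the real div_round_up comes from the stdlib.h mentioned below):

/* Round-up integer division: biases the divisor larger, so the resulting
 * clock undershoots rather than overshoots the requested frequency. */
#define div_round_up(x, y) (((x) + (y) - 1) / (y))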
Finally, the hand-tuned temporary UART clock configuration was adjusted so
that it still ends up with the same divisor. Without that adjustment, very
early output from the bootblock (the coreboot welcome banner, build
timestamp, etc.) is garbled.
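As a worked example (assuming TEGRA_CLK_M_KHZ is 24000, per the "div 13 at
24MHz" comment in the diff below):

/* Old macro, truncating:  24000 * 2 / 1800 - 2          = 26 - 2 = 24
 * New macro, rounding up: div_round_up(48000, 1900) - 2 = 26 - 2 = 24
 * Rounding up with the old 1800 would give 27 - 2 = 25, a different
 * divisor. A register value of 24 in 15.1 format means dividing by
 * 24/2 + 1 = 13, the expected "div 13". */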
BUG=chrome-os-partner:27220
TEST=Built and booted on nyan. Used a logic analyzer to verify that the TPM
i2c bus ran at 400kHz instead of 660kHz, and that the divisor was the expected
value. Measured boot time with and without EFS and verified that there was no
change. Spot-checked the output for errors and verified that none of the
bootblock output was garbled.
BRANCH=None
Had to add the stdlib.h change from 89ed6c that hadn't been merged correctly.
Original-Change-Id: I7e948c361ed4bf58c608627d32f2e3424faea1fb
Original-Signed-off-by: Gabe Black <gabeblack@google.com>
Original-Reviewed-on: https://chromium-review.googlesource.com/193362
Original-Reviewed-by: Julius Werner <jwerner@chromium.org>
Original-Commit-Queue: Gabe Black <gabeblack@chromium.org>
Original-Tested-by: Gabe Black <gabeblack@chromium.org>
(cherry picked from commit 164f7010a47d3bbdbc8bb572106140ae186f3807)
Signed-off-by: Marc Jones <marc.jones@se-eng.com>
Change-Id: I317b66eda929c0e5a5832adca267b8b54c6aae34
Reviewed-on: http://review.coreboot.org/7736
Tested-by: build bot (Jenkins)
Reviewed-by: Stefan Reinauer <stefan.reinauer@coreboot.org>
Diffstat (limited to 'src/soc/nvidia/tegra124')
-rw-r--r-- | src/soc/nvidia/tegra124/clock.c             |  4
-rw-r--r-- | src/soc/nvidia/tegra124/include/soc/clock.h | 13
2 files changed, 11 insertions, 6 deletions
diff --git a/src/soc/nvidia/tegra124/clock.c b/src/soc/nvidia/tegra124/clock.c
index a81cf5f05a..4675b7b124 100644
--- a/src/soc/nvidia/tegra124/clock.c
+++ b/src/soc/nvidia/tegra124/clock.c
@@ -298,12 +298,12 @@ static void graphics_pll(void)
  * Will later move it to PLLP in clock_config(). The divisor must be very small
  * to accomodate 12KHz OSCs, so we override the 16.0 UART divider with the 15.1
  * CLK_SOURCE divider to get more precision. (This might still not be enough for
- * some OSCs... if you use 13KHz, be prepared to have a bad time.) The 1800 has
+ * some OSCs... if you use 13KHz, be prepared to have a bad time.) The 1900 has
  * been determined through trial and error (must lead to div 13 at 24MHz). */
 void clock_early_uart(void)
 {
 	write32(CLK_M << CLK_SOURCE_SHIFT | CLK_UART_DIV_OVERRIDE |
-		CLK_DIVIDER(TEGRA_CLK_M_KHZ, 1800), &clk_rst->clk_src_uarta);
+		CLK_DIVIDER(TEGRA_CLK_M_KHZ, 1900), &clk_rst->clk_src_uarta);
 	setbits_le32(&clk_rst->clk_out_enb_l, CLK_L_UARTA);
 	udelay(2);
 	clrbits_le32(&clk_rst->rst_dev_l, CLK_L_UARTA);
diff --git a/src/soc/nvidia/tegra124/include/soc/clock.h b/src/soc/nvidia/tegra124/include/soc/clock.h
index ae3e9acfdd..7038b87b7e 100644
--- a/src/soc/nvidia/tegra124/include/soc/clock.h
+++ b/src/soc/nvidia/tegra124/include/soc/clock.h
@@ -19,6 +19,7 @@
 #define __SOC_NVIDIA_TEGRA124_CLOCK_H__
 
 #include <stdint.h>
+#include <stdlib.h>
 
 enum {
 	CLK_L_CPU = 0x1 << 0,
@@ -174,6 +175,7 @@ enum {
 #define CLOCK_PLL_STABLE_DELAY_US 300
 
 #define IO_STABILIZATION_DELAY (2)
+
 /* Calculate clock fractional divider value from ref and target frequencies.
  * This is for a U7.1 format. This is not well written up in the book and
  * there have been some questions about this macro, so here we go.
@@ -195,7 +197,7 @@ enum {
  * and voila, upper 7 bits are (ref/freq-1), and lowest bit is h. Since you
  * will assign this to a u8, it gets nicely truncated for you.
  */
-#define CLK_DIVIDER(REF, FREQ)	((((REF) * 2) / (FREQ)) - 2)
+#define CLK_DIVIDER(REF, FREQ)	(div_round_up(((REF) * 2), (FREQ)) - 2)
 
 /* Calculate clock frequency value from reference and clock divider value
  * The discussion in the book is pretty lacking.
@@ -232,11 +234,14 @@ enum {
  * We can deal with those here and make it easier to select what the actual
  * bus frequency will be. The 0x19 value is the default divisor in the
  * clk_divisor register in the controller, and 8 is just a magic number in the
- * documentation. Multiplying by 2 compensates for the different format of the
- * divisor.
+ * documentation.
  */
 #define clock_configure_i2c_scl_freq(device, src, freq)		\
-	clock_configure_source(device, src, (freq) * (0x19 + 1) * 8 * 2)
+	clrsetbits_le32(&clk_rst->clk_src_##device,			\
+		CLK_SOURCE_MASK | CLK_DIVISOR_MASK,			\
+		src << CLK_SOURCE_SHIFT |				\
+		(div_round_up((TEGRA_##src##_KHZ),			\
+			((freq) * (0x19 + 1) * 8)) - 1))
 
 enum clock_source {  /* Careful: Not true for all sources, always check TRM! */
 	PLLP = 0,
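To see where the measured bus frequencies come from, a worked example
(assuming the i2c source is PLLP at 408MHz, i.e. TEGRA_PLLP_KHZ = 408000,
and a 400kHz SCL target; the numbers are illustrative, not from the commit):

/* Old path: the frequency passed down was 400 * (0x19 + 1) * 8 * 2 = 166400,
 * so CLK_DIVIDER(408000, 166400) = 816000 / 166400 - 2 = 4 - 2 = 2.
 * The i2c block divides by n + 1 = 3: 408000 / 3 / (26 * 8) ~ 654kHz,
 * matching the ~660kHz measured before the fix.
 * New path: div_round_up(408000, 400 * (0x19 + 1) * 8) - 1 = 5 - 1 = 4.
 * The i2c block divides by n + 1 = 5: 408000 / 5 / (26 * 8) ~ 392kHz,
 * just under the 400kHz target, as intended. */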