I have heard of some of these CPUs over the years.
An interesting one for me was the "Intel Xeon LV 1.6 D1 (Prestonia)":
The Prestonia core was basically the Pentium 4 Northwood with SMP (symmetric multiprocessing) and Hyper-Threading added as standard features. With the sub-$200 1.6GHz Xeon running at a frugal 1.274V, overclockers generally couldn't take advantage of voltage headroom, as most boards were voltage-locked. However, simply raising the FSB would net 2.6GHz.
For the more adventurous, three hard mods could yield a 100% overclock (or more!): the U-wire mod, which involved bridging two (1.5V) or three (1.6V) sets of socket pins; the BSEL mod, which isolated or broke CPU pins to raise the FSB limit to 200MHz; and the vDIMM mod to raise RAM voltage.
Those willing to push the limits of the technology could be rewarded with a 3.2GHz dual processor performance king for around $700 (CPUs, coolers, board, and RAM).
Even without the hard mods, this was a solid CPU for the price with just a basic FSB raise.
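The clock math above is just multiplier times FSB. A quick sketch, assuming the 1.6GHz Prestonia's locked 16x multiplier on a 100MHz base clock (quad-pumped to 400MT/s) — these figures match the numbers in my comment, but verify against the datasheet:

```python
# Core clock = locked multiplier x FSB base clock.
# 16x is the assumed multiplier for the 1.6GHz Prestonia (1600 / 100).
MULTIPLIER = 16

def core_clock_mhz(fsb_mhz: float) -> float:
    """Core clock in MHz for a given FSB base clock in MHz."""
    return MULTIPLIER * fsb_mhz

print(core_clock_mhz(100))    # stock: 1600 MHz (1.6 GHz)
print(core_clock_mhz(162.5))  # simple FSB raise: 2600 MHz (2.6 GHz)
print(core_clock_mhz(200))    # BSEL-modded 200 MHz FSB: 3200 MHz (3.2 GHz)
```

The 200MHz row is why the BSEL mod mattered: with the multiplier locked, the FSB was the only knob, and 200MHz lands exactly on the 3.2GHz figure.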
Getting into the GPU game in the current environment is a huge undertaking. Arc was clearly a first-generation product. Battlemage is far more viable; if you're going for a relatively cheap dGPU, there is no better option than the B580/B570.
Trump tariffs will hit consoles, monitors, and laptops hardest — U.S. imports 66% or more from China
According to Bloomberg, the U.S. relies on imports from China for a majority of the following products by value: game consoles (86%, $6 billion), PC monitors (79%, $5 billion), smartphones (73%, $41 billion), lithium-ion batteries (70%, $16 billion), and laptops (66%, $32 billion).
Not to mention countries other than China (particularly ones that are involved in tech hardware manufacturing like Vietnam and Thailand) are also under tariffs.
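As a back-of-envelope check on those Bloomberg figures: since each China value is a share of the category total, the implied total US import value per category is China value divided by China share. This is my own arithmetic, not from the article:

```python
# (category, share of US imports sourced from China, China import value in $B),
# per the quoted Bloomberg figures.
imports = [
    ("game consoles",      0.86, 6),
    ("PC monitors",        0.79, 5),
    ("smartphones",        0.73, 41),
    ("Li-ion batteries",   0.70, 16),
    ("laptops",            0.66, 32),
]

for name, china_share, china_value_bn in imports:
    total_bn = china_value_bn / china_share  # implied total US imports in the category
    print(f"{name}: ~${total_bn:.0f}B total US imports")
```

So smartphones alone imply roughly $56B of total US imports in the category, which gives a sense of the absolute dollar exposure behind the percentages.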
Samsung 32" EMDX color e-paper display runs Tizen 8.0, offers 2560x1440 resolution, up to 200 days of battery life - CNX Software
In TED Talk, Google demos Android XR: The 'Tony Stark' glasses that translate languages in real time
Sounds like a solution in search of a problem.
Not to mention that the Android XR platform will be shut down pretty soon (this is Google after all).
There are numerous ways to measure AI throughput, making it difficult to compare chips. Google is using FP8 precision as its benchmark for the new TPU, but it's comparing it to some systems, like the El Capitan supercomputer, that don't support FP8 in hardware. So you should take its claim that Ironwood "pods" are 24 times faster than comparable segments of the world's most powerful supercomputer with a grain of salt.
This is not a grain of salt. This is premeditated lying.
Honestly, the whole article reminds me of this:
Can you configure the output for something other than a podcast discussion and have a more dry, technical lecture format?
I have always done copy/paste for password confirmations. "Yes" and "confirm" are new to me; I hadn't heard of such inputs. 🤣🥃
I guess the next step would be solving the issue: recognizing that you may need to take inspiration from other countries and also apply a more realistic risk management/tolerance approach (the matters we are discussing often include difficult choices).
I am talking about hypotheticals. Just like saying Apple will tolerate 2% profit margin vs their current ~35%.
It's a thought experiment.
To be fair Optane was an actual real-world (i.e. a commoner with money could purchase it) example of a product that offered a good middle ground between SSDs and DRAM using a novel technology. If money was not an issue, you would be better off running Optane than even the top SSD of the time.
Why do you say that? Note I did not downvote you, I almost never downvote in this community (I am a mod after all).
I don't watch YT videos around mainstream technology (niche topics is different), so I only read GN's articles. It's relatively balanced with solid analysis considering the target audience.
What sort of baseline in terms of posture do you have in mind? Let's say LTT is shill positivity and GN is a fixation on negativity. What sort of balanced approach do you think would work?
Keep in mind I am not a GN fanboy and I only read their articles when they are posted here. But that being said I would rather choose "commercial negativity" that represents my interests than "shill positivity".
And in defence of GN, they've actually done a lot of good for the global PC enthusiast community. GN taking up a story forces OEMs and semiconductor designers to react and they are big enough that they don't necessarily need review samples (so they have some leverage).
How is this a bad thing?
This is true. But it also ignores price dynamics.
One of the first GPUs that I "bought" (convinced my father to pay for an upgrade) was the GeForce 6600 for ~$250 or so (maybe $275 max) in 2004. This is the true price, not American-style list price. We bought it for that price (in local currency) at a computer store. I believe US true prices were (much?) lower than $275 at that time, but I could be wrong.
$275 in 2004 is around $470 in 2025. You are not getting a Nvidia 6600 class card for $470 (all in) from AMD or Nvidia. The closest would be the Intel B580 which goes for around $340 (true price) where I live. But I would argue the B580 is not comparable to what the 6600 was in 2004. And the 6600 was broadly available in 2004 (at relatively competitive prices) even though I did not live in the "western world".
And keep in mind that I don't remember the exact price of the 6600 that we bought in 2004. My memory tells me it was around $250, which would be $420 in current dollars (a solid difference from the $470 mentioned earlier).
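The inflation adjustment above is just a multiply by a cumulative CPI factor. A sketch, assuming a 2004-to-2025 factor of ~1.71 (that's what the $275-to-$470 conversion in my comment implies; the exact factor depends on which index you use, so check BLS data):

```python
# Assumed cumulative US CPI factor for 2004 -> 2025; verify against BLS figures.
CPI_2004_TO_2025 = 1.71

def to_2025_dollars(price_2004: float) -> float:
    """Convert a 2004 US dollar price into approximate 2025 dollars."""
    return price_2004 * CPI_2004_TO_2025

print(f"${to_2025_dollars(275):,.2f}")  # ~ the $470 figure above
print(f"${to_2025_dollars(250):,.2f}")  # ~ the $420-ish figure above
```

The $250 vs. $275 uncertainty in my memory is exactly what produces the $420-vs-$470 spread after adjustment.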
I wouldn't go as far as a 100% increase in price, but they will definitely pass on the cost.
I am just curious about their PR tactics as they are based in the US. Will they simply not say anything (while not replying to journalists' emails on this topic) and raise prices? Or will they try and come up with some PR copytext?
I meant that in theory a certain % of the components could be produced in the US. Say the CPU could be manufactured at a TSMC plant in Arizona and the semiconductor validation/packaging done in one of the recently opened plants for these sorts of services (the previous US administration did award some grants specifically around validation/packaging).
Of course, many of the components cannot realistically be produced in the US. I am just going through a "what if" scenario.
While having every single component of the iPhone manufactured in the US (something along the lines of RCA pre-1970) is indeed pure fantasy, you could hypothetically assemble the iPhone in the US with many of the components being locally produced.
But for that, you would need a competitive smartphone market (i.e. no 35% margin for Apple; they would have to deal with a 2%-3% profit margin, 5% tops) and American consumers would have to be exposed to the true cost of the device. No annual upgrades through your carrier: you pay $1,500+ for a baseline iPhone and you expect it to last ~5 years with occasional repair.
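To make the margin part of the thought experiment concrete: treating margin as (price - cost) / price, the same unit cost prices out very differently at 35% vs. 3%. The $1,000 cost figure below is purely hypothetical, just for illustration:

```python
def price_at_margin(unit_cost: float, margin: float) -> float:
    """Selling price for a given unit cost, with margin = (price - cost) / price."""
    return unit_cost / (1 - margin)

unit_cost = 1000  # hypothetical all-in cost per device, purely illustrative

print(f"{price_at_margin(unit_cost, 0.35):.0f}")  # 1538 at an Apple-like 35% margin
print(f"{price_at_margin(unit_cost, 0.03):.0f}")  # 1031 at a commodity 3% margin
```

That ~$500 gap on a $1,000 cost is the kind of room a competitive market would squeeze out, on top of the higher US production cost itself.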
I am curious to see if Dell and HP will make any statements on this issue, given that they are based in the US.
An executive at an Asian NAND module manufacturer said they were taking a similar approach to Micron to tell U.S. customers they had to figure out the tariffs themselves. "If they don't want to bear the taxes, we cannot ship the products. We cannot be held accountable for the decisions made by your government," the person said, declining to be named as they were not permitted to speak to the media. "With this kind of tax rate, no company can generously say, 'I'll take on the burden'."
This is a pretty frank and to the point statement on the issue. Americans do need to figure out what's going on with their political and government system.
Kioxia optical-based PCIe 5.0 SSDs support cable lengths up to 40 meters — now they just need optical-ready data centers
Razer halts laptop sales to US consumers — this response to US tariffs could become commonplace
New 'DRAM+' memory designed to provide DRAM performance with SSD-like storage capabilities, uses FeRAM tech
Money shot:
Zooming back out, we think the overall picture is clear. NVIDIA has downsized essentially all of its gaming GPUs in terms of relative configuration compared to each generation’s flagship. All of the lines go down. The chart from earlier had a lot of words to say one thing: Line go down = bad. We don’t want the line to go down. We want the line to stay the same or go up.
The 80 class is now in line with former 70 class GPUs and the 70 Ti/Super class is now in line with former 60 Ti class territory. The last 60 class card was configured like a 50-class of yore.
For Nvidia, gaming is now just an annoying side project that they have to keep up with to maintain their Plan B scenario. They are a highly sophisticated organization; somewhere in their internal analysis there is a scenario that covers a significant decline in "AI" GPU revenues.
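GN's "line go down" trend is easy to reproduce from public CUDA-core counts. The figures below are from memory (verify against Nvidia's official spec pages); the share is simply 80-class cores divided by flagship cores:

```python
# (generation, 80-class CUDA cores, flagship CUDA cores) -- approximate public specs.
generations = [
    ("RTX 30 (3080 / 3090)",  8704, 10496),
    ("RTX 40 (4080 / 4090)",  9728, 16384),
    ("RTX 50 (5080 / 5090)", 10752, 21760),
]

for name, cores_80, cores_flagship in generations:
    share = cores_80 / cores_flagship
    print(f"{name}: 80-class has {share:.0%} of the flagship's cores")  # 83%, 59%, 49%
```

From ~83% of the flagship down to ~49% in two generations: that is the line going down.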
X leaker Jukanlosreve claims in his recent post that Qualcomm and Samsung are separately developing custom RISC-V cores to reduce their ARM licensing costs.
We need more detailed confirmation on this, but it is not at all surprising. It's that Masayoshi Son magic touch that gave us Adam Neumann and WeWork.
Not a fan of MKBHD, but I could totally see Pixels coming out on top.