Poll shows 84% of PC users unwilling to pay extra for AI-enhanced hardware
The dedicated TPM chip is already being used for side-channel attacks. A new processor running arbitrary code would be a black hat's wet dream.
It will be.
IoT devices are already getting owned at staggering rates. Adding a learning model that currently cannot be secured is absolutely going to happen, and going to cause a whole new large batch of breaches.
The “s” in IoT stands for “security”
Do you have an article on that handy? I like reading about side channel and timing attacks.
I would pay for AI-enhanced hardware...but I haven't yet seen anything that AI is enhancing, just an emerging product being tacked on to everything they can for an added premium.
In the 2010s, it was cramming a phone app and wifi into things to try to justify a higher price, while also spying on users in new ways. The device might even have a screen for basically no reason.
In the 2020s, it's those same useless features, now with a bit of software with a flashy name that removes even more control from the user and lets the manufacturer spy on the user even further.
It's like rgb all over again.
At least rgb didn't make a giant stock market bubble...
Anything AI actually enhanced would be advertising the enhancement not the AI part.
DLSS and XeSS (XMX) are AI and they're noticeably better than non-hardware-accelerated alternatives.
My Samsung A71 has had devil AI since day one. You know that feature where you can mostly use fingerprint unlock, but then once a day or so it asks for the actual passcode for added security? My A71's AI has a 100% success rate of picking the most inconvenient time to ask for the passcode instead of letting me do my thing.
Already had that Google thingy for years now. The USB/nvme device for image recognition. Can't remember what it's called now. Cost like $30.
Edit: Google coral TPU
I use it heavily at work nowadays. It would be nice to run it locally.
You don't need AI enhanced hardware for that, just normal ass hardware and you run AI software on it.
https://github.com/huggingface/candle
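For anyone curious, something roughly like this is all it takes to run a small open model on a plain CPU (a sketch in Python with the transformers library rather than candle; gpt2 is just a stand-in model):

```python
from transformers import pipeline

# gpt2 is just a tiny example model; device=-1 means "run on the plain CPU"
generator = pipeline("text-generation", model="gpt2", device=-1)

out = generator("PC hardware in 2024 is", max_new_tokens=20)
print(out[0]["generated_text"])
```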
You can look into this, however it’s not what this discussion is about
I'm curious what you use it for at work.
I'm generally opposed to anything that involves buying new hardware. This isn't the 1980s. Computers are powerful as fuck. Stop making software that barely runs on them. If they can't make ai more efficient then fuck it. If they can't make game graphics good without a minimum of a $1000 gpu that produces as much heat as a space heater, maybe we need to go back to 2000s era 3d. There is absolutely no point in making graphics more photorealistic than maybe Skyrim. The route they're going is not sustainable.
The point of software like DLSS is to run stuff better on computers with worse specs than what you'd normally need to run a game at that quality. There's plenty of AI tech that can actually improve experiences, and saying that Skyrim graphics are the absolute max we as humanity "need" or "should want" is a weird take ¯\_(ツ)_/¯
The quality of games has dropped a lot, they make them fast and as long as it can just about reach 60fps at 720p they release it. Hardware is insane these days, the games mostly look the same as they did 10 years ago (Skyrim never looked amazing for 2011. BF3, Crysis 2, Forza, Arkham City etc. came out then too), but the performance of them has dropped significantly.
I don't want DLSS and I refuse to buy a game that relies on upscaling to have any meaningful performance. Everything should be over 120fps at this point, way over. But people accept the shit and buy the games up anyway, so nothing is going to change.
The point is, we would rather have games looking like Skyrim with great performance vs '4K RTX real time raytracing ultra AI realistic graphics wow!' at 60fps.
We should have stopped with Mario 64. Everything else has been an abomination.
Only 7% say they would pay more, which to my mind is the percentage of respondents who have no idea what "AI" in its current bullshit context even is
I figure they're those "early adopters" who buy the New Thing! as soon as it comes out, whether they need it or not, whether it's garbage or not, because they want to be seen as on the cutting edge of technology.
I am generally unwilling to pay extra for features I don't need and didn't ask for.
Raytracing is something I'd pay for even if unasked, assuming it meaningfully impacts the quality and doesn't demand outlandish prices.
And they'd need to put it in unasked and cooperate with devs else it won't catch on quickly enough.
Remember Nvidia Ansel?
I was recently looking for a new laptop and I actively avoided laptops with AI features.
Look, me too, but the average punter on the street just looks at the new AI features and goes, OK sure, give it to me. Tell them about the dodgy shit that goes with AI and you'll probably get a shrug at most.
The biggest surprise here is that as many as 16% are willing to pay more...
Acktually it's 7% that would pay, with the remainder 'unsure'
I mean, if framegen and supersampling solutions become so good on those chips that regular versions can't compare I guess I would get the AI version. I wouldn't pay extra compared to current pricing though
What does AI enhanced hardware mean? Because I bought an Nvidia RTX card pretty much just for the AI enhanced DLSS, and I’d do it again.
When they start calling everything AI, soon enough it loses all meaning. They're gonna have to start marketing things as AI-z, AI 2, iAI, AIA, AI 360, AyyyAye, etc. Got their work cut out for em, that's for sure.
Instead of Nvidia knowing some of your habits, they will know most of your habits. $$$.
Just saying, I’d welcome some competition from other players in the industry. AI-boosted upscaling is a great use of the hardware, as long as it happens on your own hardware only.
Who in the heck are the 16%
I would pay for Weird-Al enhanced PC hardware.
Those Weird Al fans will be very disappointed
I'm interested in hardware that can better run local models. Right now the best bet is a GPU, but I'd be interested in a laptop with dedicated chips for AI that would work with pytorch. I'm a novice but I know it takes forever on my current laptop.
Not interested in running copilot better though.
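For what it's worth, the usual PyTorch pattern (just a sketch; dedicated NPUs generally need their own vendor backend on top of this) is to grab whatever accelerator it can see and fall back to the CPU:

```python
import torch

# Use whatever accelerator PyTorch can actually see, otherwise fall back to CPU.
if torch.cuda.is_available():
    device = torch.device("cuda")       # NVIDIA GPU
elif torch.backends.mps.is_available():
    device = torch.device("mps")        # Apple silicon
else:
    device = torch.device("cpu")

x = torch.randn(2048, 2048, device=device)
y = x @ x  # the matmul-heavy kind of work that crawls on an older laptop CPU
print(device, y.shape)
```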
Maybe people doing AI development who want the option of running local models.
But baking AI into all consumer hardware is dumb. Very few want it. SaaS AI is a thing. To the degree SaaS AI doesn't offer the privacy of local AI, networked local AI on devices you don't fully control offers even less. So it makes no sense for people who value convenience. It offers no value for people who want privacy. It only offers value to people doing software development who need more playground options, and I can go buy a graphics card myself, thank you very much.
I would if the hardware was powerful enough to do interesting or useful things, and there was software that did interesting or useful things. Like, I'd rather run an AI model to remove backgrounds from images or upscale locally, than to send images to Adobe servers (this is just an example, I don't use Adobe products and don't know if this is what Adobe does). I'd also rather do OCR locally and quickly than send it to a server. Same with translations. There are a lot of use-cases for "AI" models.
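OCR is a good example of how little it takes to keep this local. A rough sketch, assuming pytesseract plus a local Tesseract install (the file name is a placeholder):

```python
from PIL import Image
import pytesseract

# Everything stays on the machine; "scan.png" is just a placeholder path.
text = pytesseract.image_to_string(Image.open("scan.png"))
print(text)
```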
Okay, but hear me out. What if the OS got way worse, and then I told you that paying me for the AI feature would restore it to a near-baseline level of original performance? What then, eh?
I already moved to Linux. Windows is basically doing this already.
One word. Linux.
I don't think the poll question was well made... "would you like to part with your money for..." vaguely shakes hand in air "...AI?"
People were already paying for "AI" even before ChatGPT came out and popularized things: DLSS.
I would pay extra to be able to run open LLMs locally on Linux. I wouldn't pay for Microsoft's Copilot stuff that's shoehorned into every interface imaginable while also causing privacy and security issues. The context matters.
That's why NPUs are actually a good thing. The ability to run LLMs locally instead of sending everything to Microsoft/OpenAI for data mining will be great.
I hate to be that guy, but do you REALLY think that on-device AI is going to prevent all your shit being sent to anyone who wants it, in the form of "diagnostic data" or "usage telemetry" or whatever weasel-worded bullshit in the terms of service?
They'll just send the results for "quality assurance" instead of doing the math themselves and save a bundle on server hosting.
Pay more for a shitty ChatGPT clone in your operating system that can get exploited to hack your device. I see no flaw in this at all.
I'm willing to pay extra for software that isn't
My old ass GTX 1060 runs some of the open source language models. I imagine the more recent cards would handle them easily.
What’s the “AI” hardware supposed to do that any gamer with recent hardware can’t?
Run it faster.
A CPU can also compute graphics, but you wait significantly longer than you would with dedicated graphics hardware.
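To put rough numbers on "faster", this is the kind of comparison you can run yourself in PyTorch (a sketch; it assumes a CUDA card is present, and the exact timings will vary wildly):

```python
import time
import torch

a = torch.randn(4096, 4096)
b = torch.randn(4096, 4096)

# Same matrix multiply on the CPU...
t0 = time.time()
_ = a @ b
cpu_s = time.time() - t0

# ...and on the GPU (synchronize so the timing is honest)
a_gpu, b_gpu = a.cuda(), b.cuda()
torch.cuda.synchronize()
t0 = time.time()
_ = a_gpu @ b_gpu
torch.cuda.synchronize()
gpu_s = time.time() - t0

print(f"CPU: {cpu_s:.3f}s  GPU: {gpu_s:.3f}s")
```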
You borked your link
I completely agree. There are some killer AI apps, but why should AI run on my OS? Recall is a complete disaster of a product and I hope it doesn't see the light of day, but I've no doubt that there's a place for AI on the PC.
Whatever application there is in AI at the OS level, it needs to be a trustless system that the user has complete control of. I'd be all for an Open source AI running at that level, but Microsoft is not going to do that because they want to ensure that they control your OS data.
Fuck, they won't upgrade to TPM for windows 11
Still haven't turned mine on, don't want no surprises after a long day at work.
I can't tell how good any of this stuff is because none of the language they're using to describe performance makes sense in comparison with running AI models on a GPU. How big a model can this stuff run, how does it compare to the graphics cards people use for AI now?
A big letdown for me is that, except in some rare cases, those extra AI features are useless outside of AI. Some NPUs are straight-up DSPs that could easily run OpenCL code; others are either designed to handle only the reduced-precision number formats used for machine learning instead of normal floating point, or are CPU extensions that are just even bigger vector multipliers for select datatypes (AMX).
Even DLSS only works great for some types of games.
Although there have been some clever uses of it, lots of games could gain a lot from proper efficiency of the game engine.
War Thunder runs like total crap on even the highest end hardware, yet World of Warships has much more detailed ships and textures running fine off an HDD and older than GTX 7XX graphics.
Meanwhile on Linux, Compiz still runs crazy window effects and the 3D cube desktop much better and faster than KDE. It's so good I even recommend it for old devices with any kind of GPU, because the hardware acceleration will make your desktop fast and responsive compared to even the lightest window managers like Openbox.
TF2 went from 32 bit to 64 bit and had immediate gains in performance upwards of 50% and almost entirely removing stuttering issues from the game.
Batman Arkham Knight ran on a heavily modified version of Unreal 3 which was insane for the time.
Most modern games and applications really don't need the latest and greatest hardware, they just need to be efficiently programmed which is sometimes almost an art itself. Slapping on "AI" to reduce the work is sort of a lazy solution that will have side effects because you're effectively predicting the output.
When a decent gpu is ~$1k alone, then someone wants you to pay more $ for a feature that offers no tangible benefit, why the hell would they want it? I haven’t bought a PC for over 25 years, I build my own and for family and friends. I’m building another next week for family, and AI isn’t even on the radar. If anything, this one is going to be anti-AI and get a Linux dual-boot as well as sticking with Win10, no way am I subjecting family to that Win11 adware.
The other 26% were bots answering.
16%
I'm fine with NPUs / TPUs (AI-enhancing hardware) being included with systems because it's useful for more than just OS shenanigans and commercial generative AI. Do I want Microsoft CoPilot Recall running on that hardware? No.
However I've bought TPUs for things like Frigate servers and various ML projects. For gamers there's some really cool use cases out there for using local LLMs to generate NPC responses in RPGs. For "Smart Home" enthusiasts things like Home Assistant will be rolling out support for local LLMs later this year to make voice commands more context aware.
So do I want that hardware in there so I can use it MYSELF for other things? Yes, yes I do. You probably will eventually too.
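The NPC idea needs surprisingly little glue code, too. A rough sketch, assuming something like an Ollama server running locally (URL, model name, and prompt are all just placeholders):

```python
import requests

# Assumes an Ollama server on localhost with a small model already pulled.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "llama3",
        "prompt": "You are a grumpy blacksmith NPC. The player asks for a discount. Answer in one line.",
        "stream": False,
    },
    timeout=120,
)
print(resp.json()["response"])
```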
I wish someone would make software that utilizes things like an M.2 Coral TPU to enhance gameplay, like with frame gen or upscaling for games and videos. Some GPUs are even starting to put M.2 slots on the GPU itself, in case the latency from a motherboard M.2 slot to the PCIe GPU would be too high.
Bro, just add it to the pile of rubbish over there next to the 3D movies and curved TVs
The other 16% do not know what AI is or try to sell it. A combination of both is possible. And likely.
As with any proprietary hardware on a GPU it all comes down to third party software support and classically if the market isn't there then it's not supported.
Assuming there's no catch-on after 3-4 cycles, I'd say the tech is either not mature enough, too expensive for too little result, or (as you said) there's generally no interest in it.
Maybe it needs a bit of maturing and a re-introduction at a later point.
Unless you're doing music or graphics design, there's no use case. And if you do, you probably have a high-end GPU anyway.
I could see use for local text gen, but that apparently eats quite a bit more than what desktop PCs could offer if you want to have some actually good results & speed. Generally though, I'd rather want separate extension cards for this. Making it part of other processors is just going to increase their price, even for those who have no use for it.
Not even on my phone
AI-en{hanced,shittified}
And the other 16% would also pay you $230 to hit them in the face with a shovel
Just need the right name for it. Soundblasters are still being produced aren't they? There's always a market.
Well yeah, because dedicated DACs have a tangible benefit of better audio. If you want better audio you need to buy a quality DAC and quality cans.
I also used to think it's dumb because who cares as long as you can hear. But then I built a new PC and I don't know if it was a faulty mobo or just unlucky setup but the internal DAC started picking up static. So I got an external DAC and what I noticed was that the audio sounded clearer and I could hear things in the sound that I couldn't hear before. It was magical, it's like someone added new layers into my favorite songs. I had taken the audio crack.
I pretty quickly gave away my DAC along with my Audio-Technicas because I could feel the urge. I needed another hit. I needed more. I got this gnawing itch and I knew I had to get out before the addiction completely took over. Now I live in static because I do not dare to touch the sun again.
Soundblasters may be shit, but the hardware they're supposed to sell is legit; it has a tangible benefit for whoever can tell the difference. But with AI, what is the tangible benefit that you couldn't get by getting a better GPU?
Tbh this is probably for things like DLSS, captions, etc. Not necessarily for chatbots or generative art.
Predictable outcome, common tech company L.
The only reason I have any enthusiasm about CoPilot+ PCs (AI PCs or whatever new name they get in 6 months) is because of ARM and battery life.
Heck, I'll trade them all the AI features for no ads.
Poll shows 84% of PC users are suckers.
I feel like the sarcasm was pretty obvious in that comment, but maybe I'm missing something.
I would already like to buy a 4k TV that isn't smart and have yet to find it. Please don't add AI into the mix as well :(
Look into commercial displays
I was just thinking the other day how I'd love to "root" my TV like I used to root my phones. Maybe install some free OS instead
We got a Sceptre brand TV from Walmart a few years ago that does the trick. 4k, 50 inch, no smart features.
All TVs are dumb TVs if they have no internet access
I just disconnected my smart TV from the internet. Nice and dumb.
Signage TVs are good for this. They're designed to run 24/7 in store windows displaying advertisements or animated menus, so they're a bit pricey, and don't expect any fancy features like HDR, but they've got no smarts whatsoever. What they do have is a slot you can shove your own smart gadget into, with a connector that breaks out power, HDMI, etc., which someone has made a Raspberry Pi Compute Module carrier board for, so if you're into, say, Jellyfin, you can make it smart completely under your own control with e.g. LibreELEC. Here's a video from Jeff Geerling going into more detail: https://youtu.be/-epPf7D8oMk
Alternatively, if you want HDR and high refresh rates, you're okay with a smallish TV, and you're really willing to splash out, ASUS ROG makes 48" 4K 10-bit gaming monitors for around $1700 US. HDMI is HDMI, you can plug whatever you want into there.
I don't have a TV, but doesn't a smart TV require internet access? Why not just... not give it internet access? Or do they come with their own mobile data plans now meaning you can't even turn off the internet access?
Anti Commercial-AI license
I'm sure that's coming up.
As a yearly fee for DRMd televisions that require Internet access to work at all maybe
Right now it's easier to find projectors without it and a smart os. Before long tho it's gonna be harder to find those without a smart os and AI upscaling