this post was submitted on 18 Jul 2024
804 points (99.5% liked)

Technology


Companies are going all-in on artificial intelligence right now, investing millions or even billions of dollars in the field while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To find out whether people would be willing to pay extra for hardware with AI capabilities, TechPowerUp put the question to its forum users.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn't know, while just under 2,000 voters said yes.

[–] magiccupcake@lemmy.world 6 points 4 months ago (2 children)

Most people already have pretty decent AI hardware in the form of a GPU.

Sure, dedicated hardware might be more efficient for mobile devices, but that's already done better in the cloud.

[–] PriorityMotif@lemmy.world 4 points 4 months ago

The Google Coral TPU has been around for years and it's cheap. It works well for object detection.

https://docs.frigate.video

There are a lot of use cases in manufacturing where you can do automated inspection of parts as they go by on a conveyor, or have a robot arm pick and place parts/boxes/pallets, etc.

Those types of systems have been around for decades, but they can always be improved.
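For reference, object detection on a Coral accelerator like the one Frigate uses is typically driven through Google's pycoral library. The snippet below is a minimal sketch, not Frigate's actual pipeline; the model and image filenames are placeholders, and it assumes pycoral and the Edge TPU runtime are already installed.

```python
# Minimal sketch: single-image object detection on a Coral Edge TPU via pycoral.
# MODEL and IMAGE are placeholder paths; any quantized EdgeTPU detection model works.
from PIL import Image
from pycoral.adapters import common, detect
from pycoral.utils.edgetpu import make_interpreter

MODEL = "ssd_mobilenet_v2_coco_quant_postprocess_edgetpu.tflite"  # placeholder
IMAGE = "frame.jpg"                                               # placeholder

interpreter = make_interpreter(MODEL)   # load the model onto the Edge TPU
interpreter.allocate_tensors()

image = Image.open(IMAGE)
# Resize the frame to the model's input size, keeping the scale so the
# returned bounding boxes can be mapped back to the original image.
_, scale = common.set_resized_input(
    interpreter, image.size, lambda size: image.resize(size, Image.LANCZOS))

interpreter.invoke()
for obj in detect.get_objects(interpreter, score_threshold=0.4, image_scale=scale):
    print(f"class={obj.id} score={obj.score:.2f} bbox={obj.bbox}")
```

In a conveyor-inspection setup like the one described above, the same loop would run on each camera frame, with the detections handed off to whatever accept/reject or pick-and-place logic sits downstream.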

[–] Nomecks@lemmy.ca 3 points 4 months ago (1 children)

It's not really done better in the cloud if you can push the compute out to the device. When you can leverage edge hardware, you save bandwidth fees and a ton of cloud costs. It's faster in the cloud because you can leverage a cluster with economies of scale, but any AI company would prefer the end user to pay for that compute instead, if they can service requests adequately.

[–] AdrianTheFrog@lemmy.world 1 points 4 months ago

Yeah, you also have to deal with latency when going through the cloud, which is a big problem for a lot of possible applications.