Maybe it's like the dotcom bubble: there is genuinely useful tech that has recently emerged, but too many companies are trying to jump on the bandwagon.
LLMs do seem genuinely useful to me, but of course they have limitations.
I am old enough to remember when the CEO of Nortel Networks got crucified by Wall Street for saying in a press conference that the telecom/internet/carrier boom was a bubble and the fundamentals weren't there (who is going to pay for long distance anymore when calls are free over the internet? where are the carriers -- Nortel's customers -- going to get their income from?). And 4 years later Nortel ceased to exist. Cisco crashed too, though it had enough TCP/IP router biz and enterprise sales to keep it alive even until today.
This all reminds me of the late 1990s internet bubble rather than the more recent crypto bubble. We'll all still be using ML models for all kinds of things more or less forever from now on, but it won't be this idiotic hype cycle and overvaluation anymore after the crash.
Shit, crypto isn't going anywhere either, it's a permanent fixture now, Wall Street bought into it and you can buy crypto ETFs from your stockbroker. We just don't have to listen to hype about it anymore.
Crypto is still just as awful as it ever was IMO. Still plenty of assholes ~~gambling~~ investing in crypto.
Well put.
Soon, it won't be this idiotic hype cycle, but it'll be some other idiotic hype cycle. Short term investors love hype cycles.
We just don’t have to listen to the hype about it anymore.
True, in most circles it’s now just been folded in as a commodity to trade on. Though I wish everyone would get that. There are still plenty of idiots with .eth usernames who think there’s some new boon to be made. The only “apps” built on crypto networks were and are purely for trading crypto; I’ve never seen any real tangible benefit to society come out of it. It’s still used plenty for money laundering, but regulators are (slowly) catching up. And it’s still by far the easiest way to demonstrate what happens to unregulated markets.
Crypto has been turned into gold by Wall Street; they bought up enough of it to not be completely exposed, its supply is extremely limited, and it will run out. Putting your money into it is no different than putting it into gold: you might catch a good moment, buy in low, and get some return, but most won't.
Putting your money into it is no different than putting it into gold
Sorry kiddo, putting your money into crypto is very, very different to putting it into gold.
Well yeah, it's easier to steal it if you click on a link you shouldn't.
The supply is effectively unlimited lol.
Not enough btc? Make lite coin! Etc etc etc
No one cares about lite coin though which defeats the purpose.
That's like saying US Dollars are Unlimited because you can always buy Zimbabwe Dollars...
idk why Baidu requires an account to download from it.
Wow, a CEO who doesn’t buy into the hype? That’s astonishing.
I, for one, cannot wait for the bubble to burst so we can get back to some sense of sanity.
Edit>> Though if Baidu is investing in AI like all the rest, then maybe they just think they’ll be immune — in which case I’m sad again that I haven’t yet come across a CEO who calls bullshit on this nonsense.
AI will have its uses, and it has practical use cases such as helping people to walk or to speak or to translate in real time, etc. But we’re decades away from what all these CEOs seem to think they’re going to cash in on now. And it’ll be fun on some level watching them all be wrong.
10 to 30? Yeah I think it might be a lot longer than that.
Somehow everyone keeps glossing over the fact that you have to have enormous amounts of highly curated data to feed the trainer in order to develop a model.
Curating data for general purposes is incredibly difficult. The big medical research universities have been working on it for at least a decade, and the tools they have developed, while cool, are only useful as tools to a doctor who has learned how to use them. They can speed diagnostics up and they can improve patient outcomes. But they cannot replace anything in the medical setting.
The AI we have is like fancy signal processing at best.
Not an expert so I might be wrong, but as far as I understand it, those specialised tools you describe are not even AI. It is all machine learning. Maybe to the end user it doesn't matter, but people have this idea of an intelligent machine when it's more like brute-force information feeding into a model system.
Don't say AI when you mean AGI.
By definition AI (artificial intelligence) is any algorithm by which a computer system automatically adapts to and learns from its input. That definition also covers conventional algorithms that aren't even based on neural nets. Machine learning is a subset of that.
AGI (artificial general intelligence) is the thing you see in movies, the thing people project onto their LLM responses, and what's driving this bubble. It is the final goal, and means a system able to perform everything a human can, at at least human level. Pretty much all the actual experts agree we're a long way from such a system.
It may be too late on this front, but don't say AI when there isn't any I to it.
Of course it could be successfully argued that humans (or at least a large amount of them) are also missing the I, and are just spitting out the words that are expected of them based on the words that have been ingrained in them.
AI as a field of computer science is mostly about pushing computers to do things they weren't good at before. Recognizing colored blocks in an image was AI until someone figured out a good way to do it. Playing chess at grandmaster levels was AI until someone figured out how to do it.
Along the way, it created a lot of really important tools. Things like optimizing compilers, virtual memory, and runtime environments. The way computers work today was built off of a lot of things out of the old MIT CSAIL labs. Saying "there's no I to this AI" is an insult to their work.
This is not up to you or me: AI is an area of expertise / a scientific field with a precise definition. Large, but well defined.
Intelligence: The ability to acquire, understand, and use knowledge.
A self-driving car is able to observe its surroundings, identify objects, and change its behaviour accordingly. Thus a self-driving car is intelligent. What's driving such a car? AI.
You're free to disagree with how other people define words, but then don't take part in their discussions expecting everyone to agree with your definition.
And they will ALL deserve it.
No bubble has deserved to pop as much as AI deserves to
Blockchain and crypto were worse. "AI" has some actual use even if it's way overblown.
I'm glad you didn't say NFTs because my Bored Ape will regain and triple its value any day now!
Bro the GME short squeeze is going to hit any day now. We're going to be millionaires bro, you just wait
Creating a specialized neural net to perform a specific function is cool. Slapping GPT into customer support because you like money is horse shit and I hope your company collapses. But yeah you're right. Blockchain was a solution with basically no problems to fix. Neural nets are a tool that can do a ton of things, but everyone is content to use them as a hammer.
Yes! "AI" defined as only LLMs and the party trick applications is a bubble. AI in general has been around for decades and will only continue to grow.
Crypto has a legitimate value, you can buy drugs with it.
Honestly kinda miss when the drugs I did were illegal. I used to buy weed from this online seller that was really into designer drugs. The amount of time I used to spend on Erowid just to figure out wtf I was about to take.
Aw, only 99%?
As a major locally-hosted AI proponent, aka a kind of AI fan, absolutely. I'd wager it's even worse than crypto, and I hate crypto.
What I'm kinda hoping happens is that bitnet takes off in the next few months/years, and that running a very smart model on a phone or desktop takes milliwatts... Who's gonna buy into Sam Altman's $7 trillion cloud scheme to burn the Earth when anyone can run models offline on their phones, instead of hitting APIs running on multi-kilowatt servers?
And ironically it may be a Chinese company like Alibaba that pops the bubble, lol.
If bitnet takes off, that’s very good news for everyone.
The problem isn’t AI, it’s AI that’s so intensive to host that only corporations with big datacenters can do it.
The fuck is bitnet
https://www.microsoft.com/en-us/research/publication/bitnet-scaling-1-bit-transformers-for-large-language-models/ -- use 1 bit instead of 8 or 16, yay performance gainz
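The appeal is easy to see with back-of-envelope math. A minimal sketch, assuming an illustrative 7B-parameter model and the roughly 1.58 bits per weight of BitNet-style ternary {-1, 0, 1} quantization (the model size and function names here are assumptions for illustration, not figures from the paper):

```python
# Rough weight-storage comparison: fp16 vs. BitNet-style ternary weights.
# Ignores activations, KV cache, and packing overhead -- weights only.

def model_weight_bytes(num_params: float, bits_per_weight: float) -> float:
    """Bytes needed to store just the model weights."""
    return num_params * bits_per_weight / 8

params = 7e9  # hypothetical 7B-parameter model

fp16_bytes = model_weight_bytes(params, 16)      # standard half precision
ternary_bytes = model_weight_bytes(params, 1.58)  # ternary {-1, 0, 1} weights

print(f"fp16:    {fp16_bytes / 1e9:.1f} GB")   # 14.0 GB
print(f"ternary: {ternary_bytes / 1e9:.1f} GB")  # 1.4 GB
print(f"shrink:  {fp16_bytes / ternary_bytes:.1f}x")  # 10.1x
```

That roughly 10x shrink is what would let a model that needs a datacenter GPU today fit in a phone's RAM, which is the whole "run it offline" argument above.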
So will the return of the flag conclude the adventures of resource usage in computers?
Not shocked. It seems the tech bros like to troll us every few years.
The tech bros are selling, but it's the VCs that are fueling this whole thing. They're grasping for the next big thing. Mostly they don't care if any of it actually works, as long as they can pump share value and then sell before it collapses.