this post was submitted on 13 Nov 2023
24 points (100.0% liked)

TechTakes

Big brain tech dude got yet another clueless take over at HackerNews etc? Here's the place to vent. Orange site, VC foolishness, all welcome.

This is not debate club. Unless it’s amusing debate.

For actually-good tech, you want our NotAwfulTech community

top 19 comments
[–] sc_griffith@awful.systems 8 points 1 year ago (1 children)

this is such a funny grift. hope ceos are torturing themselves over whether the random noise interpreters will like them. imagine an exec staring in the mirror repeating a line over and over to develop the right intonation to fool ai

[–] Send_me_nude_girls@feddit.de 7 points 1 year ago (1 children)

Just train a model with your voice and never speak a real word on your own ever again. Call it voice purists. It's going to happen.

[–] sc_griffith@awful.systems 11 points 1 year ago (1 children)

I'm sorry, but that won't help your earnings call. As soon as you give it a few microseconds of voice data, the model will simulate your life from first principles and find out your company is fucked. you think the ai is going to throw that information away? every exquisite subvocal pang of agony will be reproduced. there's only one thing to do. the only way out is through. show up so blitzed out on coke you don't even know you're in an earnings call. you have to do it. it's called charging the fucking machine gun nest man. our grandparents knew about this before they got all wrapped up in this tech shit. that's what they taught you in world war two. they didn't even know what a phone was back then. can you imagine? that's fucking wild man. and now you have chatgpt and it's smarter than half the people I know. that's fucking wild. life! chatgpt. how do I buy a machine gun

[–] self@awful.systems 4 points 1 year ago

ah, you’ve known some of the same type of idiot executives I have

[–] bitofhope@awful.systems 8 points 1 year ago

LLMs are so notoriously terrible at telling truth from lies that "AI hallucination" is a household phrase at this point, for better or for worse. But surely they work even better when asked to rate the truthfulness of things that are not in their corpus to begin with.

[–] sue_me_please@awful.systems 8 points 1 year ago* (last edited 1 year ago)

What about an AI that can tell if that cute candidate our startup hired will sleep with me or if she'll just lie and say yes and then tell HR?

And while we're at it can we make an LLM that will force my kids to call me?

[–] locallynonlinear@awful.systems 6 points 1 year ago (1 children)

We need to filter out people who exhibit voice stress, because no one likes a person with the humility to take uncertainty seriously.

[–] dgerard@awful.systems 4 points 1 year ago

we need to filter anyone who uses earnings calls for anything other than comedy

[–] toychicken@mastodon.social 6 points 1 year ago

@dgerard I mean, god forbid VCs did even the most basic due diligence. No, we'll use AI to tell if this good news is true or not!

[–] self@awful.systems 6 points 1 year ago (1 children)

is there a pseudoscience that VCs and promptfans aren’t trying to turn into a startup? we’ve got medical woo everywhere, AI startups are essentially repackaging everything from race science to mediums into a bullshit product, and now we’ve got this superstitious crap. there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject

[–] dgerard@awful.systems 5 points 1 year ago

> there’s a drinking game somewhere in all this where you pull a random RW page and take a shot if there isn’t a startup trying to monetize the article’s subject

perfect

[–] froztbyte@awful.systems 5 points 1 year ago

I'd register fuckedcompany.ai but I happened to discover some years back that .ai didn't allow saying fuck in the domain name. goddamn tyranny

but there's some real revivalist potential for fuckedcompany in all this dogshit

[–] naevaTheRat@lemmy.dbzer0.com 5 points 1 year ago (1 children)

This is dowsing again. I'm so glad nothing ever changes.

[–] naevaTheRat@lemmy.dbzer0.com 7 points 1 year ago

Bwahaha, wifey points out that buyers could really have used what this product purports to be before they bought it.

[–] saucerwizard@awful.systems 3 points 1 year ago (1 children)
[–] carlitoscohones@awful.systems 6 points 1 year ago

I'm imagining that last Tesla earnings call with Musk holding 2 soup cans in a flop sweat.

[–] swlabr@awful.systems 2 points 1 year ago (1 children)

Random thought: earnings calls are like streams. Buying/selling stock is subbing/unsubbing. Asking questions is superchatting/donating with a message. AI sentiment analysis is crazed fans hyperanalysing the stream to confirm whatever conspiracy they have about the streamer.

NB: i don’t partake in stream culture

[–] dgerard@awful.systems 5 points 1 year ago (1 children)

an earnings call is very like a stream, yes

[–] swlabr@awful.systems 5 points 1 year ago

Just as cringe, for sure