this post was submitted on 22 Oct 2023
199 points (93.1% liked)

Google Pixel

[–] adam@kbin.pieho.me 48 points 1 year ago (2 children)

ITT people who don't understand that generative ML models for imagery take up TB of active memory and TFLOPs of compute to process.

[–] hotdoge42@feddit.de 20 points 1 year ago* (last edited 1 year ago) (3 children)

That's wrong. You can do it on your home PC with Stable Diffusion.

[–] KLISHDFSDF@lemmy.ml 20 points 1 year ago

And a lot of those require models that are multiple gigabytes in size, which then need to be loaded into memory and processed on a high-end video card that would generate enough heat to ruin your phone's battery if it could somehow be shrunk to fit inside a phone. This just isn't feasible on phones yet. Is it technically possible today? Yes, absolutely. Are the tradeoffs worth it? Not for the average person.
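
For a rough sense of the numbers, here's a back-of-the-envelope sketch. The parameter counts are approximate public figures for Stable Diffusion 1.5 (UNet + text encoder + VAE) and are used purely as an illustration, not exact values:

```python
# Back-of-the-envelope memory footprint for holding a diffusion model's
# weights in RAM. Parameter counts below are approximate figures for
# Stable Diffusion 1.5 and are illustrative assumptions.
def model_memory_gb(params: float, bytes_per_param: int) -> float:
    """Gigabytes needed just to hold the weights, before any activations."""
    return params * bytes_per_param / 1024**3

sd15_params = 0.86e9 + 0.12e9 + 0.08e9  # UNet + text encoder + VAE, ~1.06B

print(f"fp32: {model_memory_gb(sd15_params, 4):.1f} GB")  # ~3.9 GB
print(f"fp16: {model_memory_gb(sd15_params, 2):.1f} GB")  # ~2.0 GB
```

Even at half precision, the weights alone approach the free RAM of many phones, and that's before activations, the OS, and every other app.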

[–] AlmightySnoo@lemmy.world 3 points 1 year ago* (last edited 1 year ago)

You can, for example, run some upscaling models on your phone just fine (I mentioned the SuperImage app in the photography tips megathread). Yes, the most powerful and memory-hungry models will need more RAM than your phone can offer, but it's a bit misleading if Google doesn't say that those are being run in the cloud.

[–] weew@lemmy.ca 46 points 1 year ago (1 children)

So much for the brilliant AI-specialized Tensor processor

It's basically just a mediocre processor that offloads interesting things to the mothership.

Yep. What a joke. Goes to show that Google could make these Pixel features available to all Android devices if they wanted to.

[–] popekingjoe@lemmy.world 39 points 1 year ago

This really doesn't surprise me.

[–] Steeve@lemmy.ca 36 points 1 year ago (1 children)

Yeah, obviously. The storage and compute required to actually run these AI generative models is absolutely massive, how would that fit in a phone?

[–] mrbaby@lemmy.world 10 points 1 year ago

They ought to just slap a phone in a 3090

[–] Tb0n3@sh.itjust.works 26 points 1 year ago (1 children)
[–] Mojojojo1993@lemmy.world 6 points 1 year ago (2 children)
[–] Ashyr@sh.itjust.works 28 points 1 year ago (2 children)

Because it's a privacy nightmare.

[–] canis_majoris@lemmy.ca 19 points 1 year ago* (last edited 1 year ago) (1 children)

Using Google products has always been a "privacy nightmare" - it's not like this is some mega open-source phone or anything; it's literally Google's flagship. Is this really surprising? Playing with fire gets you burned.

[–] p5f20w18k@lemmy.world 15 points 1 year ago (1 children)

Google's phones are the easiest to de-Google, surprisingly

[–] 9point6@lemmy.world 9 points 1 year ago (1 children)

Even ignoring all the privacy issues with that, it's kinda shit to unnecessarily lose phone features when you've got no signal

[–] Mojojojo1993@lemmy.world 17 points 1 year ago (5 children)

Isn't that kinda the dream? We have devices that remote the OS, so we get a super powerful device that keeps getting updated and upgraded. We just need a receiver?

Isn't that what we want? It can cut down the bulk on devices: just a slab with a battery, screen, modem, and an SoC that can power the remote application?

[–] botengang@feddit.de 45 points 1 year ago (2 children)

Sometimes that's what people dream about. On the other hand, that hybrid cloud model means giving up the last remnants of computing autonomy and control over the devices we own.

[–] Uli@sopuli.xyz 9 points 1 year ago

What I would like is something that gives me the framework to host my own server-side computations at home.
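
A minimal sketch of what the plumbing for that could look like, using only Python's standard library. The endpoint and the echo "inference" are made up for illustration; a real setup would load an actual model behind the handler:

```python
# Minimal self-hosted "AI endpoint" for a home server, standard library only.
# The handler just echoes the request back so the plumbing is clear; a real
# server would run model inference where the stand-in result is built.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length) or b"{}")
        # Stand-in for real model inference:
        result = {"echo": payload.get("prompt", ""), "status": "ok"}
        body = json.dumps(result).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep the demo quiet
        pass

def serve(port: int = 8080):
    """Block forever, serving POST requests on the home network."""
    HTTPServer(("0.0.0.0", port), InferenceHandler).serve_forever()
```

The phone then only needs to POST to your own box instead of Google's, which is the whole point.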

[–] soulfirethewolf@lemdro.id 7 points 1 year ago (1 children)

That would be, if Google weren't constantly killing things that didn't do well enough, especially given how expensive generative AI can be to run remotely. Just look at what happened with Stadia.

Also, it just feels disappointing. Ever since ChatGPT, they've been pouring a near-infinite budget into stuff like this, hiring the top talent and working them into the late hours of the night. And the best they could come up with for the Pixel 8 is feeding it data from the cloud.

And I can't really believe the whole "consumer hardware isn't powerful enough" thing, given that there are quite a few ARM processors, Apple's especially, that have made consumer hardware capable of generative AI (I've personally been able to run Stable Diffusion and Whisper on my M1 MacBook). Maybe not at the scale or quality of the cloud, but still capable of doing so regardless.

[–] amenotef@lemmy.world 7 points 1 year ago* (last edited 1 year ago) (5 children)

I mean it sucks for offline situations or for actions that need very good latency.

But the phone's battery would be happier if the processing is not done locally.

For some things I prefer to do the work locally, for other things in the cloud.

Some cloud vs. local examples I'm thinking about:

  1. For example (not related to the Pixel): if I'm generating an image with Stable Diffusion at home, I prefer to use my RX 6800 on my private local Linux box rather than a cloud computer with a subscription. But if I had to do the same on a mobile phone, with its tiny processing power and battery capacity, I'd prefer to do it in the cloud.

  2. (Another, non-AI example): for gaming, I prefer to run things natively, at least until it's possible to stream a game without added latency. Obviously I don't play games on a phone.

  3. Another example (Google Photos): I would prefer to connect Google Photos to a local NAS server at home, "own the storage on premises," and then pay a lower fee to Google Photos for the extra services it brings: browsing, face and object recognition, etc. But this is not an option. With this I would be able to invest in my own storage, even if I had dozens of gigabytes of 4K 60 FPS videos.

[–] dustyData@lemmy.world 3 points 1 year ago (7 children)

Obviously, I mean, Google did so well with Stadia.

/s

[–] BlovedMadman@lemmy.world 16 points 1 year ago* (last edited 1 year ago) (2 children)

This just strengthens the argument for installing privacy/security-first operating systems like CalyxOS and GrapheneOS. I don't want a phone that's more a service subscription than it is hardware. I have the Pixel 8 and didn't get the Pro due to the offloading to Google servers for some "features".

Just waiting for GrapheneOS to be released for the 8... Until then, I'm sitting uncomfortably knowing my phone is uploading telemetry to Google servers...

[–] ReginaPhalange@lemmy.world 7 points 1 year ago (1 children)

I've bitten the bullet and replaced my Redmi 13 with a Pixel 7 + GrapheneOS, because the MIUI spyware and Google spyware are just too much... I still don't get why, as the owner of a phone, I'm not the owner of the phone.

[–] Onii-Chan@kbin.social 4 points 1 year ago (1 children)

Currently using GrapheneOS on my 8 Pro, and the experience has been extremely smooth so far.

[–] BlovedMadman@lemmy.world 2 points 1 year ago

Yea, I was tempted for a second to get it, but there was a comment saying it "should" be possible to upgrade to the released version when available, and I don't want to have to mess around in the event that it's not. So I'm just going to wait a week or so.

[–] Swarfega@lemm.ee 9 points 1 year ago (1 children)

People didn't think this was the case already?

[–] Sargteapot 4 points 1 year ago