

“Preorders” were a fully refundable $100. The overwhelming majority of preorders canceled.
If we had one more season of Silicon Valley, Russ would absolutely be rubbing fingerprints out of his Founder Edition Cybertruck.
Looking at the 2024 sales numbers for the F-150 Lightning and R1T, the Cybertruck is selling comparatively well.
That said, I can’t imagine that the look of the truck and the actions of Tesla’s CEO are helping sales in any way.
This article doesn’t have any sources, and all of the sites talking about this are pretty unreliable. I’ll believe it’s real when the reliable rumor sites pick it up.
That said, Apple Intelligence has three tiers of prompt processing: Apple’s private on-device model, Apple’s private cloud model for more complex prompts, and external model integration for the most complex prompts. That last tier is model agnostic.
Apple launched with ChatGPT as an optional integration, but in theory, it could plug into whatever. Gemini, DeepSeek, etc.
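To put it in code terms, the routing is conceptually something like this. A loose Swift sketch only: the type names and the length-based complexity heuristic are mine, not Apple’s actual API.

```swift
// Loose sketch of tiered prompt routing. Hypothetical types only,
// not Apple's real API.

enum PromptTier {
    case onDevice       // simple prompts, handled locally
    case privateCloud   // heavier prompts, Apple's private cloud model
    case externalModel  // most complex prompts, pluggable external provider
}

protocol ExternalModel {
    var name: String { get }
    func respond(to prompt: String) -> String
}

// The external slot is provider-agnostic: ChatGPT today, Gemini or DeepSeek tomorrow.
struct ChatGPTStub: ExternalModel {
    let name = "ChatGPT"
    func respond(to prompt: String) -> String { "stub answer from \(name)" }
}

struct PromptRouter {
    var external: any ExternalModel

    // Crude length-based complexity heuristic, purely for illustration.
    func tier(for prompt: String) -> PromptTier {
        switch prompt.count {
        case ..<64:  return .onDevice
        case ..<512: return .privateCloud
        default:     return .externalModel
        }
    }

    func handle(_ prompt: String) -> String {
        switch tier(for: prompt) {
        case .onDevice:      return "on-device model answer"
        case .privateCloud:  return "private cloud model answer"
        case .externalModel: return external.respond(to: prompt)
        }
    }
}

// Swapping providers is just swapping the injected model.
let router = PromptRouter(external: ChatGPTStub())
print(router.handle("What's 2 + 2?")) // short prompt, stays on device
```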
Apple will probably just swap GPT for DeepSeek in China, just like they swap Google for Baidu with search.
Fuck the big tech companies and all, but I don’t buy the argument that there is no competition in the US. If you believe that, you’re not paying attention to the space. There are a fuckload of weird models being developed in the US. Some by big players, and some by smaller companies.
IMHO, this is the same thing that happens with every new big advancement. PCs, internet, mobile, etc. People invest a shit load of money in the early players, then a ton of those early investments don’t pan out.
And oftentimes, the people who really stand out are the smaller disruptors or the companies that come in a little later.
The thing with bullet point 1 is that finding exploits is becoming MUCH easier with LLMs. That said, it’s now an arms race. Can you deploy AI to pressure test your systems and find the gaps before the bad actors do the same?
Wait until I show them my phpBB.
Taps “learn more”
I just left because the platform is full of spam, narcissists and crazy family members that I don’t need in my life. It’s not fun anymore. Hasn’t been for a decade.
My only problem is that Matter support usually means “basic functionality” for your IoT devices. On / off, hot / cold, etc. You typically still have to create an account with a proprietary app to configure more nuanced things. For example, the shape and soil characteristics of my irrigation zones, the motion detection trigger areas of my cameras, the fade and trim settings of my light switches, etc.
I don’t know how you sort this mess out. I’d love to never install a third-party app for IoT stuff, but I don’t see that happening anytime remotely soon.
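For the curious, here’s roughly the split I mean, as a Swift sketch. The protocols and the sprinkler example are made up for illustration, not the real Matter clusters or any vendor’s SDK.

```swift
// Made-up protocols to show the gap. Not the real Matter clusters
// or any vendor's actual SDK.

// What Matter-level "basic functionality" typically covers:
protocol BasicOnOff {
    mutating func setOn(_ on: Bool)
}

// The nuanced configuration that still lives in the maker's own app:
protocol VendorIrrigationConfig {
    mutating func setZoneSoil(zone: Int, soilType: String)
    mutating func setZoneArea(zone: Int, squareFeet: Double)
}

struct SprinklerController: BasicOnOff, VendorIrrigationConfig {
    var isOn = false
    var zoneSoil: [Int: String] = [:]
    var zoneArea: [Int: Double] = [:]

    // Reachable from any Matter controller.
    mutating func setOn(_ on: Bool) { isOn = on }

    // Only exposed through the vendor's proprietary app/account.
    mutating func setZoneSoil(zone: Int, soilType: String) { zoneSoil[zone] = soilType }
    mutating func setZoneArea(zone: Int, squareFeet: Double) { zoneArea[zone] = squareFeet }
}

var sprinklers = SprinklerController()
sprinklers.setOn(true)                            // the part Matter standardizes
sprinklers.setZoneSoil(zone: 3, soilType: "clay") // the part that still needs the vendor app
```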
The problem isn’t the alert itself, it’s that cops put Twitter links in the alert. If you want to see what the car, suspect, or victim look like, you need to be able to access Twitter.
Police have been doing this for years now. It’s a fast and cheap way to microblog without buying or supporting something with the city’s budget.
“Copyright industry” is such a weird term. Why not use the term everyone already knows: media companies?
True, although shady Amazon sellers will probably still sell you a fire hazard cord that isn’t to spec.
That’s why Apple charges an arm and a leg for RAM.
IMHO, they have pretty different use cases. For example, I can’t use a search engine to compose things.
Weird. New installs usually get some sort of onboarding screen that explains how to activate the new stuff.
The 18.2 ChatGPT stuff can be manually enabled under Settings > Apple Intelligence > scroll way down > ChatGPT. Once enabled, Writing Tools and Siri will give you the option to send a query to ChatGPT instead of Apple’s model.
If Siri gets stumped, it will ask if you want to query ChatGPT. Or you can just prompt it with “Ask ChatGPT ______.”
Writing Tools has it buried under “Compose,” which is at the very bottom of the Writing Tools sheet.
Agreed.
IMHO, the only truly useful things are Writing Tools and Siri being able to query ChatGPT for complex questions instead of telling people to pull out their phone and search the web.
The stuff everyone was actually interested in is likely in 18.4. On-screen awareness, integration with installed apps, contextual replies, etc.
Probably worth noting, this survey was taken before 18.2 went live with a ChatGPT integration, image generation, etc.
Now filter by display technology.