My use case will be to remove the looping bland house music from tech product demos. I really don’t understand why every tech company needs to make me feel like I’m at a club when I’m just there to learn about a new feature in their product and how I can use it.
Sure, but $120k is definitely not FAANG-tier base comp in SF. Not even close. Maybe it’s on the low side of scrappy startup/scaleup comp.
The UPS driver that delivers to my home office a bag of electronic goodies every week couldn’t care less about what OS I use. I mean I even tried to tell him about all the awesome Minty Pops and Arches and all he had to say was “that Fedora looks fucking dope, bro. Say, do you listen to Hannah Montana?”
I use cloud computing to run a lot of my computer stuff. Not a PC. I self-host some services on a home-server. Also not a PC. I can install a GUI on these if I want and RDP into them, still doesn’t make these PCs.
I can use my personal laptop as a server if I want (and I have!) with remote-access enabled; so it is both a PC and a not-PC?
I think we have to settle on PC being use-case-driven, not hardware-defined. Which is what I think you were getting at, but abstracting too far.
That’s fair. I see what I see at an engineering and architecture level. You see what you see at the business level.
That said, I stand by my statement, because I and most of my colleagues in similar roles get continued, repeated, and expanded-scope engagements. Especially in LLMs and genAI in general over the last 3-5 years or so, but definitely not just in LLMs.
“AI” is an incredibly wide and deep field; much more so than the common perception of what it is and does.
Perhaps I’m just not as jaded in my tech career.
operations research, and conventional software which never makes mistakes if it’s programmed correctly.
Now this is where I push back. I spent the first decade of my tech career doing ops research/industrial engineering (in parallel with process engineering). You’d shit a brick if you knew how much “fudge-factoring” and “completely disconnected from reality—aka we have no fucking clue” assumptions go into the “conventional” models that inform supply-chain analytics, business process engineering, etc. To state that they “never make mistakes” is laughable.
Absolutely not true. Disclaimer: I do work for NVIDIA as a forward-deployed AI Engineer/Solutions Architect, meaning I don’t build AI software internally for NVIDIA but embed with their customers’ engineering teams to help them build their AI software and deploy and run their models on NVIDIA hardware and software. edit: any opinions stated are solely my own; N has a PR office to state any official company opinions.
To state this as simply as possible: I wouldn’t have a job if our customers weren’t seeing tremendous benefit from AI technology. The companies I work with are typically very sensitive to the CapEx and OpEx costs of AI; they self-serve in private clouds. If it doesn’t help them make money (revenue growth) or save money (efficiency), then it’s gone, and so am I. I’ve seen it happen: entire engineering teams laid off because a technology just couldn’t be implemented in a cost-effective way.
LLMs are a small subset of AI and Accelerated-Compute workflows in general.
We’re looking at this from opposite sides of the same coin.
The NN graph is written at a high level in Python using frameworks (PyTorch, TensorFlow; man, I really don’t miss TF after jumping to Torch :) ).
But the calculations don’t execute in the Python interpreter. Sure, you could write it that way, but it would be sloooow. The actual network of calculations happens within the framework internals, which are C++. Then, depending on the hardware you want to run it on, you go down to BLAS or CUDA, etc., all of which are written in low-level languages like Fortran or C.
NumPy fits in all throughout this stack, and its performant pieces are mostly implemented in C.
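To make that split concrete, here’s a rough sketch (toy model, made-up layer sizes, nothing from any real codebase): the graph is described in Python, but every layer call hands the actual math off to the framework’s compiled kernels.

```python
# Minimal sketch: the graph is *described* in Python, but the matmuls inside
# nn.Linear dispatch to compiled C++/CUDA (or BLAS) kernels under the hood.
import torch
import torch.nn as nn

class TinyNet(nn.Module):
    def __init__(self):
        super().__init__()
        # Layer definitions are just Python objects holding weight tensors.
        self.fc1 = nn.Linear(64, 32)
        self.fc2 = nn.Linear(32, 10)

    def forward(self, x):
        # Each call here is a thin Python wrapper; the heavy lifting happens
        # in the framework's native kernels, not in the interpreter.
        x = torch.relu(self.fc1(x))
        return self.fc2(x)

model = TinyNet()
batch = torch.randn(8, 64)  # 8 samples, 64 features each
out = model(batch)          # shape: (8, 10)
print(out.shape)
```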
Any way you slice it, the point I was making in response to that post is that AI IS CODE. No two ways about that. It’s also the weights, biases, and activations of the models that have been trained.
Neural nets are typically written in C; frameworks then abstract on top of that (like Torch or TensorFlow), providing higher-level APIs to languages like Python (most commonly) or JavaScript.
There are some other NN implementations in Rust, C++, etc.
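If you want a feel for why the compiled layer underneath matters, here’s a back-of-the-napkin comparison (the sizes and timings are purely illustrative and will vary by machine): the same matrix multiply done entirely in the interpreter versus handed to NumPy’s BLAS-backed np.dot.

```python
# Rough illustration of the layering at small scale: the loop below and
# np.dot compute the same result, but np.dot hands the work to compiled
# BLAS routines instead of running it in the Python interpreter.
import time
import numpy as np

n = 200
a = np.random.rand(n, n)
b = np.random.rand(n, n)

def matmul_pure_python(x, y):
    # Triple nested loop, executed entirely by the interpreter.
    out = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(n):
            s = 0.0
            for k in range(n):
                s += x[i][k] * y[k][j]
            out[i][j] = s
    return out

t0 = time.perf_counter()
matmul_pure_python(a.tolist(), b.tolist())
t1 = time.perf_counter()
np.dot(a, b)  # dispatches to BLAS (C/Fortran)
t2 = time.perf_counter()

print(f"pure Python: {t1 - t0:.3f}s, NumPy/BLAS: {t2 - t1:.4f}s")
```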
Costco’s soft-serve is way better than McD’s and actually is cheap.
Bullshit. Developers never make mistakes. N.E.V.R.
No, but it’s basically an “I can use it to build my billion-dollar business and keep the profits if I want” license. The only real catch is that if I decide to modify the code and distribute it, the license requires me to share those changes with whoever gets the modified version. There’s nothing in the GPL that stops me from being a downstream freeloader, and I can stay on whatever version I like; no one’s forcing me to update to newer ones with terms I don’t agree with. Forking and modifying for my own needs is totally fine, as long as I slap the same GPL on the changes if I hand them out.
You can scan before the encryption step. It defeats the purpose of the encryption, since the privileged actor doing the scanning sees the plaintext while everyone downstream only gets encrypted bytes, but technically it’s possible.
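Purely to illustrate the mechanics, here’s a toy sketch (the blocklist, hashes, and send helper are all hypothetical, and it assumes the third-party cryptography package): the scan runs on the plaintext, and only afterwards are the bytes encrypted for everyone downstream.

```python
# Toy "scan before encrypt" sketch: the client checks the plaintext against
# a hypothetical blocklist of hashes, and only then encrypts. Everyone
# downstream sees ciphertext; the scanning step sees everything.
import hashlib
from cryptography.fernet import Fernet  # pip install cryptography

BLOCKLIST = {
    # hypothetical SHA-256 digests supplied by the privileged actor
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan(plaintext: bytes) -> bool:
    """Return True if the message matches the blocklist."""
    return hashlib.sha256(plaintext).hexdigest() in BLOCKLIST

def send(plaintext: bytes, key: bytes) -> bytes:
    if scan(plaintext):
        # A real deployment would file a report here; the scanner has the
        # plaintext, the recipient never will.
        raise ValueError("message flagged before encryption")
    return Fernet(key).encrypt(plaintext)

key = Fernet.generate_key()
ciphertext = send(b"hello world", key)
print(ciphertext[:16], b"...")
```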
It’s only a matter of time until a vulnerability in that privileged access is found and silently exploited by a nefarious monkey, and that’s precisely why backdoors should never be added.
Can’t wait for Nintendo to sue Microsoft because VS Code can be used to edit save files.
The funny thing here is that there are many good distributions that are based on Ubuntu. I’m a Pop!_OS fanboy, many of my colleagues enjoy Mint. Yet, almost everyone I know in the Linux world despises Ubuntu.
If I want to wear my sunglasses while I’m watching a movie in the cinema because I have a light-sensitivity condition, and the sunglasses alter my perception of the film without changing the permanent media storage of the film, am I cheating and subject to copyright infringement action?
Stop giving me Thermo nightmares; I lived through that shit already I don’t need to sleep through it too.
And you just reminded me of a movie from a show and let’s just say I recommend both.
https://youtu.be/X1DcKBwliAw