Don't connect it to your wifi.
I have an LG C2 (circa 2019). I disabled everything I could through the menus. I never plugged in ethernet and had wifi disabled. One day my wife turned on the TV and it started doing an update. I flipped out on her for obviously having turned on wifi; how else could it update? Nearly (okay, not quite, but it felt like it) got divorced over it, until I realized that, no, she really hadn't. The damn TV had formed a mesh network with my neighbour's TV over Bluetooth (I live in an apartment, and apparently the concrete floor wasn't enough to block the signal).
These TVs will stop at nothing to phone home and get connectivity. When I phoned LG and freaked out at them, they told me there was some setting about 6 menus deep, totally unrelated to anything mesh- or network-related, that, if disabled, would prevent it from connecting to other TVs. It was not obvious, and it wasn't mentioned anywhere in the documentation or in the many, many pages of EULA that I actually read through when I first turned on my TV after purchase.
I want a TV that takes an HDMI (or DisplayPort) input and displays it. No TV tuner, built-in apps, or internet-enabled anything. It also needs to support HDCP, or else my Apple TV refuses to send video to it. Regular monitors don't have HDCP support without being a smart TV. It's infuriating.
If it has to be thicker for battery life reasons, that's fine. It's going in a case anyway and getting clipped to a holster on my belt. Thickness doesn't matter. Features matter. Width matters. Height matters. Usability matters.
I thought I was the only one that still used a holster! I found my people!
Also, fully agree about Touch ID being superior to Face ID. I wish Apple actually did proper market research and talked to those of us who are willing to buy a 'pro' iPhone but don't want a bloody tablet. Give me a good camera, storage, battery life, and a 5S or 6S form factor. Hell, I'd accept a 7! And to your point, it can be a few mm thicker, that's fine!
Lately, I find myself spending a lot of time with the old consoles and computers I still have from before games started having online components. They were fun then, and they still are.
Spot on. I'm on a SNES binge, and once my new C64 comes in (apparently next month), I'm sure I'll be able to spend all my free time reliving my childhood from the '80s and '90s. I think the most modern game I actually enjoyed on the Xbox was Peggle.
We're still in the early 2000s; we've only progressed about 2.4% of the way from 2000 to 3000 (24 years out of 1,000). If you want to say "the 00s", then say "the 00s".
I also mined Bitcoin -- 350 BTC on a CPU, and sold it all for less than 10 cents apiece. Good times!
I'll take your trolling and troll back as follows:
When someone says "early 2000s", they mean the 00s. I have never seen your version. Though you may be technically correct (the best kind of correct), I don't believe you are contextually correct. If we take your interpretation and put it into the context of the original sentence, it makes no sense. In context, it's clear the author was referring to the decade from 2000 to 2009.
But that's okay; you sold 350 BTC for $35, so the universe has punished you enough.
Bought pre-iPhone Apple stock or mined a bunch of Bitcoin in the early 2000's....
Umm... Bitcoin didn't exist in the early 2000s. I believe it came into existence around 2008/2009.
I actually mined Bitcoin. I think I had something like 1.2 BTC back in 2011, then I lost the wallet. It was worth next to nothing at the time. Now it's enough to buy a nice car.
But the better analogy would have been to talk about Blockbuster not buying Netflix back in the day...
Well, according to the dictionary, "artificial intelligence" is:
the capability of computer systems or algorithms to imitate intelligent human behavior
Notice the presence of the word "imitate." Imitation implies that it is not the real thing. Like how imitation leather is not real leather. So, a machine does not need to be intelligent in order to imitate intelligent behavior.
The fact is, this is a very broad term that encompasses a wide range of computer behaviors, and so it is naturally going to be fuzzy around the edges. Sometimes, vagueness is precisely what makes a word useful.
You are technically correct, the best kind of correct!
That being said, it's just like Tesla's 'Autopilot': just because the 'legal' definition says it's a driver assistant, not true autonomous driving, doesn't mean people won't look at the word and take it to mean autonomous driving. Same thing with AI. 'Artificial intelligence' is going to be taken at face value: intelligence that is artificial. Very few people know the dictionary definition, or that 'imitate' is a key word in it. That makes a big difference, but the vast majority of people assume AI means true intelligence.
Why else is Joe Q. Public freaking out that AI is going to take their jobs? We're treating AI like we treated blockchain and countless other 'going to change the world overnight' tech. It's a useful area, but it's mislabeled, overhyped, and pulling resources away from other, dare I say, more important areas in the field of CS.
"Resist RTO! Quit your job!" they said. Funny how, for 2 years, I've been seeing layoff after layoff, and most Slashdotters still don't grasp that RTO is just not something they can resist.
When companies announce RTO mandates, they know that some people will quit out of protest; it's calculated into the cost of the mandate. Some companies are even using it as a form of layoff that saves them money, since a person who quits doesn't get severance.
It's cyclic.
This particular cycle, the software sector bet big on hiring for "AI" applications, and then found out that the applications (inaccurately) called "AI" are more of a toy than a product that's going to make them money.
Wait for the next fad; they'll be hiring again.
I hate that they call it 'AI'. It is not intelligent. It is a predictive algorithm. Whether it's an LLM or some other base algorithm, it is not intelligent. Even the generative AI images are terrible, as they cannot learn from their mistakes. In order for something to be called intelligent, it needs to be able to learn and demonstrate that new knowledge. I have yet to see any AI do that.
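To make the "predictive algorithm" point concrete, here's a toy sketch in Python. It's a made-up bigram frequency model, nothing like a real LLM's internals, but the core task is the same: predict the next token from what came before.

```python
from collections import Counter, defaultdict

# Toy "next-word predictor": count which word follows which in a tiny corpus,
# then generate text by always emitting the most frequent follower.
# Real LLMs are vastly larger neural networks, but at inference time the job
# is the same kind of thing: predict the next token. No understanding required.
corpus = "the cat sat on the mat and the cat slept on the mat".split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, length=6):
    word, out = start, [start]
    for _ in range(length):
        if word not in follows:
            break
        word = follows[word].most_common(1)[0][0]  # pick the most likely next word
        out.append(word)
    return " ".join(out)

# Produces a plausible-looking but mindless continuation of the corpus.
print(generate("the"))
```

It strings words together that look statistically reasonable, and that's all it does, which is exactly the objection.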
Also, any 'AI' developer worth their salt will carefully avoid implying intelligence; they'll always fall back on the underlying algorithm to describe what they're working on. It's a good way to filter out the candidates who took a 6-hour course online and label themselves AI devs from the actually talented devs who know what they're doing with an LLM, for example.
My 2 cents: get into firmware development. If you are good at low-level bit twiddling and debugging, you'll make a great career out of it.
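For anyone wondering what "bit twiddling" looks like day to day, here's a minimal Python sketch of the mask-and-shift work firmware is full of. The register layout and bit positions below are made up for illustration; on real hardware they'd come from the chip's datasheet, and you'd typically be doing this in C.

```python
# Hypothetical control register bits (illustrative only, not a real device).
ENABLE_BIT = 1 << 0   # bit 0: peripheral enable
IRQ_BIT    = 1 << 3   # bit 3: interrupt enable
READY_BIT  = 1 << 7   # bit 7: device ready flag

reg = 0x00

reg |= ENABLE_BIT | IRQ_BIT          # set bits without touching the others
reg &= ~IRQ_BIT                      # clear one bit, leave the rest alone
reg ^= READY_BIT                     # toggle a bit
is_enabled = bool(reg & ENABLE_BIT)  # test a bit

print(f"reg = {reg:#010b}, enabled = {is_enabled}")
```

If reading and writing registers like that (and chasing the bugs when you get a mask wrong) sounds fun, firmware is a good fit.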
Post-drug-withdrawal-recovery Peterson, for sure. Something damaged the shit out of his brain.
There was a time when a fair amount of what he had to say was not too bad. Now he's modeling for the bad guy in some post-apocalyptic horror show.
Look at things from his perspective, especially over the last few years. He is constantly under attack by very powerful entities (at least in his life); I'd likely get paranoid and grouchy too if the government, the media, my professional college, etc., all turned on me.
Save energy: Drive a smaller shell.