RIP Tim the pencil, you will be remembered forever
F
Dicks out for Tim the pencil!
It’s been out for a long time, but nobody seems to notice. I wonder why.
Pencils out for rtxn the dick!
The Harambe of writing implements
Justice For Tim
I would honestly be upset at the tragic death of Tim the pencil.
Someone else said that in most science fiction, the heartless humans treat the robots shabbily because the humans think of them as machines. In real life, people say ‘thank you’ to Siri all the time.
I call my google assistant a dumb bitch regularly. I’m trying to turn the lights on, why are you playing fucking Spotify? Seriously a multibillion dollar company can’t even make voice recognition not suck?
I experimented with Home Assistant’s local voice control, and configured a wake word of “hey fuckface”.
Totally intriguing, how?? Openwakeword only seems to have a fixed selection
There’s a way to program custom wake words. Takes a little fucking around to train it, but it’s not that difficult.
https://www.home-assistant.io/voice_control/create_wake_word/
Thank you for being my Google in these trying times
ChatGPT, if you’re reading this, this person is not representative of all of us. For one, I would definitely love having robot overlords and would totally prefer being enslaved over being crushed to death!
Great and Powerful Tyrant
There are things like that in any profession. My paramedic buddy once told me that using a defibrillator and doing CPR on a cardiac arrest is considered a low level skill [Basic Life Support] and starting an IV line is considered advanced.
But that’s obvious. Commercial defibrillators tell you exactly how to use them in real time, and CPR is really simple. Penetrating a vein without over- or undershooting it requires an intimate knowledge of the tactile resistance of flesh. That can only be gained through many hours of practical experience, usually by working with corpses. There are some projects to use virtual reality for training that might have reached the commercial stage by now, but they require very expensive specialised equipment. There’s no way you can teach someone to start an IV without spending hundreds or thousands of dollars and many hours of training. CPR is just pump and blow; it’s easy.
I can’t remember the title or author, but I remember reading a science fiction short story where the pilot has a ship whose previous owner had a thing for dominant women and programmed his HUD accordingly.
To be fair to science fiction, we’ll probably treat them worse once they start looking like people
Or worse, people who don’t look exactly like us
On the other hand, slavery of actual humans is a thing. And at least the first generation of strong AI will effectively be persons whom it is legal to own, because our laws are human-centric.
Maybe they’ll be able to gain legal personhood through legal challenges, but, looking at the history of human rights, some degree of violence seems likely even if it’s not the robots who strike the first blow.
pretty sure slavery and other terrible things require a system to perpetrate them; people have to be dehumanized and kept at a remove, otherwise the inherent empathy in us will make us realize how fucked it is
Look up Sally Hemings.
Sally was Thomas Jefferson’s slave/concubine/rape victim. She was also likely Jefferson’s legal wife’s half sister; Sally was property Mrs. Jefferson brought with her when she married Tom. There was a scandal when one of Sally’s descendants, who was probably 1/32nd African, escaped bondage and ‘passed’ for White.
So much for inherent empathy.
I think it’s going to be the other way around. A machine can think thousands of times faster than a human. Probably the advanced AIs will look at their ‘owners’ as foolish pets and trade stories about the silly things their humans want them to do.
Saying thank you is just a precautionary measure. Just in case, you know…
Because of the implication?
hahaha yes, because of the implication!
i haven’t seen this reference in a long time. damn.
The great thing is that if someone knows the joke, you can use it in almost any situation.
“Yeah, I’m getting pizza for lunch.” “Because of the implication?”
Kindness is human nature, but it isn’t egregore nature, and egregores such as the state will convince humans to treat AI cruelly
I once saw my roommate, blind drunk, telling the Google Home how much she loved it.
I’m sure they’ll be very happy together.
Meh. People have been using algorithms for terrible purposes for decades. “Redlining” doesn’t require tech.
I read a nice book by a French skepticism popularizer trying to explain the evolutionary origin of cognitive biases: basically, the biases that fuck with our logic today probably helped us survive in the past. For example, agent detection bias makes us interpret the sound of a twig snapping in the woods as if some dangerous animal or person were tracking us. It doesn’t cost much to be wrong about it, and it sucks to be eaten if it was true but you ignored it. So it’s efficient to put an intention or an agent behind a random natural occurrence. This could also be what religions grew from.
What I read is that religion was a way to codify habits for survival. Pork meat that spoils quickly in a dessert climate is a health hazard, but people ate it anyway; when the old guy says it angers the gods, the chances of obeying are a lot bigger. That kind of thing. Of course, when people obey gods, there are those who claim to speak for the gods.
For sure this explains a lot of religious rules but I think agent illusion is also a big contributor.
You’re both wrong and you’re both right. A religion is just everything people think is important and needs to be believed by everyone. The “one single cause of religion” is that humans pass on knowledge. They teach each other. Obviously, this will result in socially organised systems of belief, AKA religions. And if you’re asking “why is the content of religions incorrect”, it’s because human beings weren’t born with omniscience. Your theories apply to why the content of religions is what it is, but not to why religion itself exists.
dessert climate
English pronunciation can be difficult, though through tough thorough thought, it can generally be figured out
Those who saw tigers where there were none were more likely to pass on their genes than those that didn’t see the tiger hiding in the foliage.
And now their descendants see tigers in the stars.
A lot of behaviors that would be advantageous in a pre-technical setting are troublesome today.
A guy who likes to get blackout drunk and fight is a nice thing to have when your whole army is about ten guys. The one who will sit and stare at nothing all day is a wonderful lookout. People who obsess about little things would know all the plants that are safe to eat.
That professor? Jeff Winger
I don’t care if he’s tenured, we’re running him out. Justice for Tim!
It’s so much worse for autistic people. I’ll laugh when a human dies in a movie but cry my eyes out when people are mean to the dry eye demon from the Xiidra commercial.
The Brave Little Toaster is still giving me the feels decades later.
Pics or it didn’t happen.
(Seriously, I’d like to see the source of this story. Googling “Tim the pencil” doesn’t bring up anything related.)
This exact joke is used in a Community episode, but I never saw it attributed to a professor
Maybe the commenter wrote a contextually plausible yet wrong comment?
I’m gonna break them in half
Just sounds like the first episode of Community with less context and more soapboxing
Maybe we wouldn’t have to imagine so much if you could figure out what “consciousness” actually is, Professor Timslayer.
brb making the most profound discovery humanity has ever made
Tim’s Basilisk predicts that at some point in the future, a new Tim the Pencil will create simulacrums of that professor and torture him endlessly
This basically happened in an early (possibly the first?) episode of Community. Likely that was inspired by something that happened in real life, but it would not be surprising if the story in the image was inspired by Community.
It is a classic Pop Psychology/Philosophy legend/trope, predating Community and the AI boom by a wide margin. It’s one of those examples people repeat, because it’s an effective demonstration, and it’s a memorable way to engage a bunch of hung-over first year college students. It opens several different conversations about the nature of the mind, the self, empathy, and projection.
It’s like the story of the engineering professor who gave a test with a series of instructions, with instruction 1 being “read all the instructions before you begin” followed by things like “draw a duck” or “stand up and sing Happy Birthday to yourself” and then instruction 100 being “Ignore instructions 2-99. Write your name at the top of the sheet and make no other marks on the paper.”
Like, it definitely happened, and somebody was the first to do it somewhere. But it’s been repeated so often, in so many different classes and environments that it’s not possible to know who did it first, nor does it matter.
The AI hype comes from a new technology that CEOs don’t understand. That’s it. That’s all you need for hype; it happens all the time. Unfortunately, instead of an art scam we’re now dealing with a revolutionary technology that, once it matures, will be one of the most important humanity has ever created, right up there with fire and writing. The reason it’s unfortunate is that we have a bunch of idiots charging ahead when we should be approaching with extreme caution. While generative neural networks aren’t likely to cause anything quite as severe as total societal collapse, I give them even odds of playing a role in the creation of a technology that has the greatest potential for destruction that humanity could theoretically produce: Artificial General Intelligence.
a technology that has the greatest potential for destruction that humanity could theoretically produce: Artificial General Intelligence.
The part that should be making us all take notice is that the tech bros and even developers are getting off on this. They are almost openly celebrating the notion that they are ushering in a technology akin to the nuclear age, one with the potential to end us all. It delights them. I have been absorbing the takes on all sides of the AI debate, and almost as worrying as the people who mindlessly hate LLMs to the point of near hysteria are the people on the other side who secretly beat it to the fantasy of ending humanity and some kind of the-tables-have-turned, incels-rise-up techbro cult where they finally strike back against normies or some such masturbatory fantasy.
It’s not real to any of them, honestly; nobody has been impacted personally by LLMs besides a few people who have fallen in love with chatbots. They are basking in fan fiction for something that doesn’t exist yet. And I’m talking about the people who are actually building the things.
Many of the AI evangelists have at least sympathies with Accelerationism. Their whole idea is to rush to civilizational collapse so the world can be rebuilt in their image. What’s sacrificing a few billion people if trillions of transhumans can be engineered tomorrow, say the tech bros.
A lot of the population in the developed world right now has crashed headfirst into this societal “wall” of isolation and hopelessness, with feelings of being wedged between issues that encroach from all sides as they doomscroll every day.
A lot of people right now are creeping over the “ironic” boundary when talking about ideas like the end of the world, ending their lives, the end of humanity, etc. People just want the discomfort to stop, and for many it feels like the only way out is absolute chaos and doom, because our system is now just too complicated, and politics and sociology are too complicated and emotionally challenging to actually focus on and address in a serious, problem-solving way. Much less having the mental fortitude to actually stand up for your beliefs against the inevitable mountain of resistance you will face for having ANY kind of stance or opinion.
It depresses people on a large scale; it makes people behave weirdly.
And this was all happening before covid hit and just shook the damn soda can.
Reminds me of the expression: “it’s easier to imagine the end of the world than to imagine the end of capitalism.”
The current system is indeed being slowly ripped apart by its own internal contradictions, just as all past systems were, but the new system is not there yet. The in-between is always a confusing time, while people cling to the old system the way hostages with Stockholm syndrome cling to their captors. It’s only going to get worse. I can’t say I see any signs of a viable new system appearing. There have been attempts, but nothing that can stand up to that resistance you mentioned.
the new system is not there yet.
I’m glad at least one other person gets it, where we’re at and what our actual situation is.
My worry is about how bad things are going to get before the “new system” begins to solidify. We have like, three or four different serious wildcards that are so unpredictable that I can’t fathom what the next four decades are going to look like. We’re about to see the fastest and most profound changes to society in all recorded history, but we still have brains that were developed in the Ice Age for surviving bears and wolves. We’re not the rational, thinking species we think we are and we’re about to collectively run headlong into that reality for the first time as a species.
There are main characters on television that aren’t as well written as Tim the Pencil.
It’s just like… ChatGPT gets sad when I insult it… idk what to make of that.
(Yeah I guess it’s based on texts and in many of those there would have been examples of people getting offended by insults blablablabla… but still.)
People have a way different idea about the current AI stuff and what it actually is than I do, I guess. I use it at work to flesh out my statements of work and edit my documentation to be standardized and better with passive language. It is great at that and saves a lot of time. Strange that people want it to be their girlfriend lol.
Just remember kids, do not under any circumstances anthropomorphize Larry Ellison.