16:55 |
Wordsmith Jarvinen |
Welcome, Starlight and TimLight. |
16:56 |
StarLight (jasmine.lordenwych) |
Good evening |
16:56 |
Wordsmith Jarvinen |
Hi, Val |
16:56 |
StarLight (jasmine.lordenwych) |
Hi Val |
16:56 |
Phrynne |
hi Val |
16:56 |
Phrynne |
hi Starlight |
16:56 |
Valibrarian Gregg |
Hello everyone! Lovely venue |
16:56 |
Wordsmith Jarvinen |
ty |
16:57 |
Lia (tokyorii) |
So it is impossible to run under it yet there is a gap |
16:57 |
Lia (tokyorii) |
Very weird |
16:57 |
Wordsmith Jarvinen |
Your bounding box may run into it Lia. |
16:57 |
Lia (tokyorii) |
Ohh |
16:58 |
Wordsmith Jarvinen |
The notecard giver is the figurine on the table in front of me. |
16:59 |
KayCooper |
ooooh everyone's still an orange cloud |
16:59 |
Valibrarian Gregg |
Thanks Wordsmith |
16:59 |
Wordsmith Jarvinen |
Hi Kay, Welcome. |
16:59 |
KayCooper |
Hi Word, Hi everyone :) |
17:00 |
Lia (tokyorii) |
Hiya! |
17:00 |
Wordsmith Jarvinen |
Lia, please find a place. I'm visually distractable. |
17:00 |
Lia (tokyorii) |
Sorry |
17:00 |
Lia (tokyorii) |
It was locking my POV from the left |
17:02 |
TimLight |
yes |
17:02 |
Valibrarian Gregg |
yes - sound good |
17:02 |
Wordsmith Jarvinen |
ty Still going to mostly use chat for the transcript. |
17:03 |
Valibrarian Gregg |
Great! ty for the notecard. This story was wonderful- short and powerful. |
17:03 |
KayCooper |
I really liked it |
17:03 |
Valibrarian Gregg |
Welcome Panny- we are just starting |
17:03 |
Valibrarian Gregg |
Take a notecard from the vase on the table :) |
17:04 |
Wordsmith Jarvinen |
It was indeed a powerful story |
17:04 |
Valibrarian Gregg |
For those that have not had a chance to read it- here's the link http://www.lightspeedmagazine.com/fiction/spider-the-artist/ |
17:04 |
Maia Antarra (psiberangel) |
Hi Everyone |
17:05 |
KayCooper |
Hi Maia :) |
17:05 |
Phrynne |
hi Maia
17:05 |
Valibrarian Gregg |
I found it a currently relevant sci-fi story! Did anyone else? Written just a few years ago |
17:05 |
Lia (tokyorii) |
Hello Maia
17:05 |
KayCooper |
I think it said 2008 or something like that |
17:06 |
KayCooper |
The moral of the story...befriend robots. |
17:06 |
Phrynne |
or befriend strangers. |
17:06 |
KayCooper |
Hi Laurali |
17:06 |
Wordsmith Jarvinen |
Well, maybe. Play music? |
17:06 |
Teixeira (boomer.teixeira) |
Someone said sci-fi and short story and I am supposed to be an author so here I am. |
17:06 |
Valibrarian Gregg |
published in 2011 |
17:06 |
KayCooper |
Befriend robots through music :) |
17:06 |
Valibrarian Gregg |
welcome Boomer! |
17:07 |
Wordsmith Jarvinen |
Notecard is in the figurine on the table in front of me. |
17:07 |
KayCooper |
Hi Teixeira |
17:07 |
Valibrarian Gregg |
The sci-fi author's style seemed super realistic to me. I could picture it easily. |
17:07 |
Wordsmith Jarvinen |
Includes a link to the story. |
17:07 |
Teixeira (boomer.teixeira) |
Hopefully I loaded in correctly for anyone and have nothing unfortunate floating around me. |
17:07 |
Wordsmith Jarvinen |
Yes, and she doesn't waste time or mince words. |
17:07 |
KayCooper |
You're all in one piece to me |
17:07 |
Valibrarian Gregg |
Music soothes the savage beast! |
17:08 |
Wordsmith Jarvinen |
And savage droid. |
17:08 |
Wordsmith Jarvinen |
In the notecard, I've basically tried to catch the major transition points of the story. |
17:09 |
Wordsmith Jarvinen |
And let the discussion we have go naturally from there. |
17:10 |
Wordsmith Jarvinen |
So we get an immediate view of the protagonist's life as well as the environmental damage her village is experiencing. |
17:11 |
Valibrarian Gregg |
The setting of the story is vividly depressing! the earth being sucked dry and the "zombie" robot spiders guarding the pipelines! |
17:11 |
Wordsmith Jarvinen |
Summarized with "My village was shit" |
17:11 |
Wordsmith Jarvinen |
And the use of past tense from the very beginning. |
17:11 |
Lia (tokyorii) |
Zombie robot spiders sound pretty cool to me. |
17:12 |
Wordsmith Jarvinen |
Depends if you are being torn apart. |
17:12 |
Lia (tokyorii) |
Oh.. |
17:12 |
Wordsmith Jarvinen |
Hi Nathan, Welcome |
17:12 |
Nathan Adored |
helllo |
17:12 |
KayCooper |
Hi Nathan :) |
17:12 |
Valibrarian Gregg |
Yes! the zombie robot spider that "befriends" her is super cool. I found tons of pics of spider robots online like this one https://3dexport.com/3dmodel-robot-spider-67899.htm |
17:13 |
Nathan Adored |
I apologist in advance if I'm a little distractificated.... I'm at two places at once, by way of an Emergency Backup Me alt.... |
17:13 |
Lia (tokyorii) |
That is awesome! |
17:13 |
Lia (tokyorii) |
The spider that is. |
17:13 |
Valibrarian Gregg |
But all the other spider robots are scary- especially when "red-eyed" angry! |
17:14 |
Nathan Adored |
haha, that bot reminds me of a more cartoony-looking one I just bought as a character for use in Daz Studio / Poser. |
17:14 |
Valibrarian Gregg |
That's ok Nathan- Glad you could join us |
17:14 |
Wordsmith Jarvinen |
Any thought on the effectiveness of the start? |
17:14 |
KayCooper |
As someone who is not a fan of spiders, I always appreciate when a story convinces me to like something I otherwise wouldn't |
17:15 |
KayCooper |
I like that the start of the story just gets to the point. |
17:15 |
KayCooper |
It sets the stage without meandering all over the place and being confusing |
17:15 |
Valibrarian Gregg |
The story pulled me in immediately- with the zombie intro and the quick development of the character. The woman's life is just awful. |
17:16 |
Teixeira (boomer.teixeira) |
Kay, you might enjoy "Deepness in the Sky" |
17:16 |
KayCooper |
It does pull you in quickly, which is good, because my mind wanders lol |
17:16 |
KayCooper |
I will look into it Teixeira, thanks :) |
17:16 |
Nathan Adored |
so, is this event being done thru the streaming station, or in local chat? |
17:16 |
Wordsmith Jarvinen |
It grabs you before that wandering occurs. |
17:17 |
Valibrarian Gregg |
Nathan- we use both voice and text but nobody is speaking now |
17:17 |
Valibrarian Gregg |
welcome Morgaine |
17:17 |
Wordsmith Jarvinen |
In local chat, largely for getting the transcript, but voice is on and viable also. |
17:17 |
KayCooper |
Some scifi short stories throw a lot of detail at you really quickly, and that can make it difficult to get into the story |
17:17 |
Nathan Adored |
ah, so no need to turn on the streaming radio station thingy here |
17:18 |
KayCooper |
Hi Morgaine :) |
17:18 |
Wordsmith Jarvinen |
No, local chat works. |
17:18 |
Morgaine (morgaine.borgin) |
Hi |
17:19 |
Nathan Adored |
so, any published-ish writers here? |
17:19 |
Valibrarian Gregg |
I loved how the guitar was used as a link to a past when life was better (her father and grandfather) and also as an escape- with her music. |
17:19 |
Wordsmith Jarvinen |
So after establishing the discord with her husband (abuse) and the decaying village environment, she adds that whatever the people's movement is, they are getting picked off by "kill and go". |
17:20 |
KayCooper |
...and it smelled like trees, when there were no trees left :( |
17:21 |
Wordsmith Jarvinen |
Yes. There's that sudden transition to the guitar, its sensory attribute and into her father playing and her being the one interested in learning. |
17:21 |
Wordsmith Jarvinen |
A lot in a very short space. |
17:21 |
Valibrarian Gregg |
yes- the theme of artificial intelligence is subtle and not over the top |
17:21 |
Valibrarian Gregg |
That made it seem more real! |
17:22 |
Phrynne |
the eyes on her zombie - robot - were blue, liquid metal. I liked that detail. |
17:22 |
Valibrarian Gregg |
I could picture the liquid blue eyes of the spider who is entranced by her guitar music -- yes! |
17:22 |
Phrynne |
as opposed to the red eyes on the ones that guarded the pipeline. |
17:22 |
KayCooper |
I've noticed that the blue eyes vs red eyes thing has been used with robots before to denote safe mode and rage mode |
17:22 |
KayCooper |
I guess blue is a soothing color |
17:22 |
Valibrarian Gregg |
It made me think that the music calmed him. |
17:23 |
KayCooper |
Or he was different than the others... |
17:23 |
Phrynne |
That was a long piece she played -- 20 minutes. |
17:23 |
KayCooper |
...and that attracted him |
17:23 |
Valibrarian Gregg |
and they made a connection or bond through her music. He had a favorite song. |
17:23 |
KayCooper |
It's how they communicated |
17:23 |
KayCooper |
Communicating through music...that's just lovely |
17:23 |
Valibrarian Gregg |
yes- The first time she played it was frantic! She was terrified of the robot. |
17:23 |
Lia (tokyorii) |
Sometimes blue can also mean cold-hearted, but blue has always leaned to the positive side of things, even in video games, stories, even the most modern things. |
17:24 |
Wordsmith Jarvinen |
Isn't it. And music does communicate. |
17:24 |
Teixeira (boomer.teixeira) |
Was never a fan of color-coded light-up eyes or flashlight heads on robots myself. Visually stunning but ultimately unrealistic. The author using that as the visual shorthand for hostile and peaceful is an interesting choice, in that it was selected to communicate the idea based on most readers' familiarity with it from film and video games. |
17:24 |
KayCooper |
I noticed they used the blue vs red in the Netflix adaptation of Lost in Space |
17:24 |
Valibrarian Gregg |
yes- Kay- the Lost in Space robot had those same colors.....I thought of that too |
17:24 |
|
Valibrarian Gregg thinks the Lost in Space robot was super cool |
17:25 |
KayCooper |
agreed |
17:25 |
Lia (tokyorii) |
Have you played team fortress 2, Same there too |
17:25 |
Wordsmith Jarvinen |
She takes it beyond just blue, however. Azure. |
17:25 |
KayCooper |
I have not, but yes, blue vs red seems to be a kind of accepted robot emotional standard |
17:25 |
Valibrarian Gregg |
I was so worried that the husband was going to destroy her guitar! or the robot was! But I guess there was more to worry about.... |
17:25 |
KayCooper |
...lately anyway |
17:25 |
Nathan Adored |
the LoS robot definitely spanned a lot of semi-lookalikes in later science fiction movies |
17:25 |
Nathan Adored |
spawned, even |
17:26 |
Teixeira (boomer.teixeira) |
Again, it's the light-up eyes that are starting to lose me. Realistically speaking, the eyes are cameras, not lights. |
17:26 |
KayCooper |
True |
17:26 |
Nathan Adored |
depends on how it works, I suppose. |
17:26 |
Valibrarian Gregg |
The story reminded me of the fear we read about- the Singularity OR the rise of some super intelligent AI with consciousness. |
17:27 |
Nathan Adored |
if its more like some sort of non-sweeping radar-ish thing instead of an actual camera, it could look like anything |
17:27 |
KayCooper |
"eyes are the window to the soul" I guess as humans we need to put eyes on things in order to feel like we can understand them or bond with them |
17:27 |
Teixeira (boomer.teixeira) |
Nathan, still a kind of camera. Still wouldn't be a lamp. |
17:28 |
Teixeira (boomer.teixeira) |
Kay, very yes. |
17:28 |
Nathan Adored |
still, I suppose they're taking their cues from Hal9000, who definitely had a glowing eye |
17:28 |
Teixeira (boomer.teixeira) |
Humans anthropomorphizing everything, especially when they are not trying to. |
17:28 |
|
Nathan Adored nods nods nods |
17:29 |
KayCooper |
I think we can't help ourselves |
17:29 |
Valibrarian Gregg |
I recently met a woman in her late 80s who got a PhD in AI back in 1960! She assured me that artificial intel is simply algorithms and could never develop consciousness. Fascinating discussion- but sci-fi stories LOVE a robot with emotions. |
17:29 |
Teixeira (boomer.teixeira) |
HAL had a red light on his monitors as an ON status indicator. Because that's a thing you do with electronics. As a result, future machines have glowing red eyes, when that was not HAL's eye. |
17:29 |
Morgaine (morgaine.borgin) |
Never say never |
17:29 |
Nathan Adored |
well, that lady was clearly talking about expert-systems, which is a different kind of AI than one that would be designed to function like a human brain |
17:30 |
KayCooper |
Hi Elenabeth :) |
17:30 |
Valibrarian Gregg |
hehe true Morgaine |
17:30 |
Elenabeth Portal |
Hello |
17:30 |
Teixeira (boomer.teixeira) |
And that's another issue with AI, shoving in too many RAM cards will not unexpectedly endow a computer with human motives. |
17:30 |
Wordsmith Jarvinen |
Notecard in the figurine on the table in front of me. |
17:30 |
Valibrarian Gregg |
and I have read of computers composing music! right? |
17:30 |
Wordsmith Jarvinen |
Still rule based |
17:31 |
Valibrarian Gregg |
The spider robot composed beautiful music. |
17:31 |
Teixeira (boomer.teixeira) |
The AI in the story up for discussion though are not of "Too many RAM cards accidentally a person" variety. And I appreciate that. |
17:31 |
Wordsmith Jarvinen |
In response to her expression of sadness. |
17:32 |
Valibrarian Gregg |
yes- which makes us ask: can human emotion be programmed? Social robots are now used in Japan as companions to the elderly or small children. |
17:32 |
Wordsmith Jarvinen |
And apparently moving beyond their original purpose. |
17:32 |
Nathan Adored |
Mind you, I liked the approach and explanation Asimov gave for his humaniform robots: they were trying to approximate humanlike thought as closely as possible, so they deliberately designed the humaniform robots to be as anatomically complete inside and our as humans, since they didn't want to take the chance that some physiological element of the human body -- say, genitals -- NOT being in the robot might cause them to miss the target in reproducing human-like thought |
17:33 |
Nathan Adored |
*inside and out |
17:33 |
Elenabeth Portal |
I think most AIs in SF that "awaken" are not an example of "too many RAM cards; accidentally a person". Instead they are examples of "computer system massively complex beyond human ability to understand; accidentally a sentience" |
17:33 |
Lia (tokyorii) |
It can, It might take awhile but us as humans are very vulnerable. So I believe its quite possible |
17:34 |
Teixeira (boomer.teixeira) |
Right, it's important to remember the brain does not have an OS or run apps; it's an organ using chemical reactions, and its function is largely a part of its structure. To make a machine work like a human it would need to be a human, and for the same reasons. |
17:35 |
Teixeira (boomer.teixeira) |
Elenabeth, while that may be what they go for as the explanation, in most sci-fi what is still presented tends to suddenly have human motives regardless of origin, and the origin still translates to NX = Arbitrary number of RAM cards. |
17:35 |
Valibrarian Gregg |
Which brings us to ask- could an AI (highly developed sentient being) have a soul? What makes a soul? I think it is more than a brain. That gets into philosophy/spirituality more than science, I suppose |
17:35 |
Nathan Adored |
Anyway, there is a reason Commander Data on Star Trek (who was deliberately patterned after Asimov's humaniform robots, as a homage, right down to having a positronic brain) had a human-shaped brain in his artificial skull. |
17:35 |
Wordsmith Jarvinen |
I'm not sure that's entirely true, Boomer. It would need a brain similar to a human brain in function, but not necessarily be humanoid. |
17:36 |
Teixeira (boomer.teixeira) |
Data, on Star Trek of all things, had a very realistic structure and AI, while sharing the stage with the unrealistic AI that I often criticize. |
17:36 |
Lia (tokyorii) |
Wouldn't having a soul, or being shown to have one, basically come down to caring, compassion and other emotions? |
17:37 |
KayCooper |
Questioning what makes a robot human seems to really be wondering what makes us more than just a biological machine. |
17:37 |
Teixeira (boomer.teixeira) |
Wordsmith, agreed. |
17:38 |
Valibrarian Gregg |
Not sure Lia- because it is possible to program those empathy reactions in AI- and they would simply be "artificial"- there is a line between what is programmed and the huge leap to "sentient consciousness". Don't you think? |
17:38 |
Wordsmith Jarvinen |
And yes, the Zombie showed emotion in its playing, and in saving her. |
17:38 |
Nathan Adored |
Teixeira: Meaning, you liked Data, but not the Enterprise's main computer? :D |
17:39 |
Teixeira (boomer.teixeira) |
Not some of the episodes involving the main computer. |
17:39 |
|
Nathan Adored gets a goofy, lopsided smile. |
17:39 |
KayCooper |
Hi Cherokee :) |
17:39 |
Cherokee (ak521pg) |
hi |
17:39 |
Elenabeth Portal |
Physicist Roger Penrose argued in his book "The Emperor's New Mind" that human-like intelligence was an artifact of quantum effects in the microtubules of our nerve cells, and would never show up in computers. I think he's wrong, but he's clearly much smarter than I am. I don't think he understood how tiny computers would get and that they would eventually be subject to the same quantum effects as human neural cells. |
17:39 |
Nathan Adored |
What about the holodeck characters? Hehe.... some of those got a little TOO humanlike. |
17:39 |
|
Nathan Adored giggles and chortles |
17:39 |
Teixeira (boomer.teixeira) |
But yes, Moriarty is an example. In a world that has the main computer and Data, playing with the difficulty settings in a video game creates Moriarty |
17:39 |
Wordsmith Jarvinen |
Hi Cherokee. Notecard in the figurine on the table in front of me. |
17:40 |
Nathan Adored |
more humanlike than the people wanted, on that occasion, anyway |
17:40 |
Valibrarian Gregg |
What do you think triggered the robots to become aggressive (move from reactive to doing a proactive sting)? |
17:41 |
Cherokee (ak521pg) |
ty Word ©©♠Smiles♠©© |
17:41 |
Lia (tokyorii) |
There is, but I also think that even if you program those emotions into an AI, it will not be the same as another AI. But technically we do that as humans: from when we're born we were told what to do, how to do it, where to do it, how to feel, and if we didn't learn it from our parents, we learned it from those around us who we care for |
17:41 |
KayCooper |
Maybe they realized it was a more effective way of preventing damage to the pipeline? |
17:41 |
Teixeira (boomer.teixeira) |
Part of what makes this story work is that it doesn't explain everything, which is always good and something Clarke argued for, sometimes in vain. |
17:41 |
Wordsmith Jarvinen |
Note the two extensions the droids have made: building a pipeline segment essentially from scratch, and shifting from reactive protection of the pipeline to running a sting. |
17:42 |
KayCooper |
It seems like a logical escalation of what their original duties were. |
17:42 |
Lia (tokyorii) |
It's like the AIs being programmed to have a "soul". We as humans are programmed on a daily basis through media, music, even just reading a book |
17:43 |
Laurali Moon (lauralimoon) |
sorry... I got a call and was AFK. Just looking through recent chat... AI is quite an interesting topic. Was it Stephen Hawking that gave dire warnings about AI? |
17:43 |
Wordsmith Jarvinen |
It is, Kay, and similar to how humans extend current skills to learn new ones. |
17:44 |
Valibrarian Gregg |
Humans still have the capacity to CHOOSE what to think- robots do not. Anyone have an example that disproves this? We may be "brainwashed" but that is always a terrible occurrence. |
17:45 |
Teixeira (boomer.teixeira) |
Almost every great thinker has given unheeded warnings about AI... |
17:45 |
Laurali Moon (lauralimoon) |
good point |
17:46 |
Lia (tokyorii) |
Agreed |
17:46 |
Elenabeth Portal |
How do you tell the difference between humans choosing what to think and humans thinking they are choosing what to think? |
17:46 |
Panny (panny.bakerly) |
Man's creation is killing him. |
17:46 |
KayCooper |
Agreed Elenabeth |
17:46 |
Maia Antarra (psiberangel) |
that brings up the concept of "free will" |
17:46 |
Nathan Adored |
The unfortunate thing is, some humans don't WANT to question what they've been consciously or unconsciously taught to think, when the reality around them doesn't match what they want it to be.... and they don't WANT to go check the facts and question their views.... Seen that one with my own eyes. 0o |
17:46 |
Valibrarian Gregg |
Good point Elenabeth! |
17:46 |
Teixeira (boomer.teixeira) |
People prefer the obvious danger; they expect something like Skynet but even more human. The unobvious danger, that humans are the problem, that we're playing with fire and it will have just as much motive and be just as much at fault as a fire, often falls on deaf ears. |
17:47 |
Valibrarian Gregg |
Anyone heard of Elon Musk? Currently warning about AI danger? |
17:47 |
Maia Antarra (psiberangel) |
yes |
17:47 |
Teixeira (boomer.teixeira) |
Yes. |
17:47 |
Valibrarian Gregg |
https://www.cnbc.com/2018/03/13/elon-musk-at-sxsw-a-i-is-more-dangerous-than-nuclear-weapons.html |
17:47 |
KayCooper |
I guess the robot/AI thing comes down to a kind of hubris...playing god |
17:47 |
Elenabeth Portal |
Yes. Warning about the dangers of AI is trendy these past few years. |
17:47 |
Elenabeth Portal |
It's a fashion thing. |
17:47 |
Lia (tokyorii) |
No , But I'm gonna look it up |
17:47 |
Elenabeth Portal |
Wait a while, and people will be on to the next big secular apocalypse |
17:48 |
Laurali Moon (lauralimoon) |
True, but AI is advancing very quickly |
17:48 |
Teixeira (boomer.teixeira) |
Technically speaking, the dangerous thing about AI is not the AI, but the people who ultimately control it. |
17:48 |
Valibrarian Gregg |
Wordsmith (from notecard): The protagonist protected by HER zombie. That personal relationship is crucial to the story. |
17:48 |
KayCooper |
People need something a little bit out there to be worried about, but not really worried about, to distract them from the scary stuff that is happening around them |
17:48 |
Panny (panny.bakerly) |
Or lose control of it |
17:48 |
Nathan Adored |
I have heard it said there are some people who want to create a sort of religion around a supposed giant AI thingy they plan to build.... of which the AI might not even be genuine, that is, it might be a fake run by the elites that would be building the AI-worshiping religion. oO |
17:48 |
Teixeira (boomer.teixeira) |
No no, no, having control of it is the bigger danger |
17:49 |
Maia Antarra (psiberangel) |
they are already with non profit status |
17:49 |
Elenabeth Portal |
What is the name of this group? |
17:49 |
Wordsmith Jarvinen |
Yes, they had bonded via music and the droid valued that relationship motivating it to protect her. |
17:50 |
Teixeira (boomer.teixeira) |
Maia, that is one of the dangers, AI might not be possible but we already have millions of people today that pretend things like Siri and Alexa are people when they are nothing more than a more complex menu system. |
17:50 |
Laurali Moon (lauralimoon) |
is the danger that humans might control it, or lose control, and that logical thought will conclude humans are the problem and need to be eliminated, as even this story suggests? |
17:50 |
Lia (tokyorii) |
Alexa and Siri are always listening |
17:50 |
Phrynne |
what about the ones that run automated house appliances? |
17:50 |
Maia Antarra (psiberangel) |
https://www.wired.com/story/anthony-levandowski-artificial-intelligence-religion/ |
17:50 |
Valibrarian Gregg |
A giant AI brain comes to mind- like in Madeleine L'Engle's A WRINKLE IN TIME (classic children's book) |
17:50 |
Lia (tokyorii) |
Alexa does that I believe doesn't she? |
17:50 |
Phrynne |
And the possibility that the information they have could be tapped by others without your consent? |
17:50 |
Panny (panny.bakerly) |
I refuse to have that in my house. |
17:51 |
KayCooper |
Me too |
17:51 |
Nathan Adored |
yeah, that is one reason I don't want an Alexa sort of device in my house... What, me? Paranoid?!? Who wants to know!?? 0O |
17:51 |
Elenabeth Portal |
Thank you |
17:51 |
Maia Antarra (psiberangel) |
Alexa creeps me out |
17:51 |
KayCooper |
Google is stalking me enough as it is |
17:51 |
Lia (tokyorii) |
I have Google Home and Alexa. I noticed the more and more I would talk about games, or even just regular stuff, Google would show me things based on it or better yet "refer" to me |
17:52 |
Cherokee (ak521pg) |
all computers can be hacked, |
17:52 |
Panny (panny.bakerly) |
It's bad enough folks don't secure or harden their devices. |
17:52 |
Teixeira (boomer.teixeira) |
The main reason all those big CEOs don't have these electronics in their house is not for social reasons or because it makes their kids smarter, but because they are rather brazenly using it to spy on people and they don't want to be bugged, watched, and corralled the same way they do to others. Now... |
17:52 |
Maia Antarra (psiberangel) |
and did anyone see the videos on Alexa creepily laughing? |
17:52 |
KayCooper whispers |
it's always listening o.o |
17:52 |
Valibrarian Gregg |
Google just appeared at the Congressional Hearings....and I think we can all agree privacy died 10 years ago :) |
17:52 |
Teixeira (boomer.teixeira) |
...give these men an AI, so they can remove people from the equation in their control of people... |
17:52 |
Panny (panny.bakerly) |
Yes, Maia |
17:53 |
Valibrarian Gregg |
The Sci-Fi robots in stories are MUCH more personal than Alexa, Siri or Cortana. |
17:53 |
Wordsmith Jarvinen |
There are a couple of good short stories based on bad outcomes of AI, "The Last of the Wild Ones" by Zelazny and "Angel, Dark Angel", by Poul Anderson (relying on memory there). |
17:53 |
Laurali Moon (lauralimoon) |
I've long since realized privacy is an illusion... unless you are 100% off the grid, and even then... ? |
17:53 |
Lia (tokyorii) |
Also true Miss Greg |
17:53 |
Teixeira (boomer.teixeira) |
Imagine the first AI are controlled by Zuckerberg, Bezos, Google... and as a result, all AI must descend from it or ultimately be influenced by it. |
17:53 |
Maia Antarra (psiberangel) |
not a happy concept for us |
17:54 |
Teixeira (boomer.teixeira) |
Spam, ads, in our brains, and across eternity, and we never, ever, create anything else. |
17:54 |
Valibrarian Gregg |
My 21 yr old nephew just asked me to help him get "off the grid"! That would be as hard as moving to Alaska and living off the land! watch INTO THE WILD I told him |
17:54 |
KayCooper |
Oh my, this got bleak |
17:54 |
Laurali Moon (lauralimoon) |
and if you're not familiar with it, read up on the concept of "the Internet of Things" |
17:54 |
Lia (tokyorii) whispers |
The Illuminati is always watching |
17:54 |
Wordsmith Jarvinen |
Generalizing way past the story now. |
17:54 |
Teixeira (boomer.teixeira) |
It's sad but the danger of AI may be moving towards the Frank Herbert version of the Butlerian Jihad |
17:54 |
Valibrarian Gregg |
I knew this book would spark a bleak discussion! hehe There is no going back. That is why Digital Citizenship has become my passion. |
17:55 |
Morgaine (morgaine.borgin) |
My husband was on 2 seasons of "Building Alaska" and Alaska Off the Grid |
17:55 |
Morgaine (morgaine.borgin) |
LOL |
17:55 |
Maia Antarra (psiberangel) |
cool Morgaine |
17:55 |
Morgaine (morgaine.borgin) |
speaking of off the grid |
17:55 |
Laurali Moon (lauralimoon) |
oh wow, Morgaine... that is cool |
17:55 |
Valibrarian Gregg |
I think it is possible to go off the grid- but not if you are in education! (There the grid is pretty mandatory) |
17:55 |
Wordsmith Jarvinen |
Coming up on 5 minutes. |
17:56 |
Lia (tokyorii) |
I'll just chill on the equinox |
17:56 |
Maia Antarra (psiberangel) |
supposedly the grid is going to be everywhere even there if some people have their way |
17:56 |
Teixeira (boomer.teixeira) |
Now, thankfully, it's not looking so bleak, AI like what they want is not possible and they will always need teams of people like me babysitting those computers... |
17:56 |
Valibrarian Gregg |
I really liked the choice this month Wordsmith! ty |
17:56 |
Teixeira (boomer.teixeira) |
Hello yes I am one of the people behind the curtain on AI |
17:56 |
Elenabeth Portal |
Herbert's Butlerian Jihad is not a danger of AI, Teixeira. It is an example of the danger of humans acting stupid in large groups. And that danger is something humans have dealt with for ages. |
17:56 |
PaulKrollWells |
Build libertarian colonies in remote areas and start a parallel society. |
17:57 |
Teixeira (boomer.teixeira) |
Elenabeth, exactly my point |
17:57 |
Valibrarian Gregg |
I'd like to see another story on this topic! |
17:57 |
Nathan Adored |
well, another thing some people are worried about is all those Internet of Things devices, a very large number of which either have poor security in them or none at all.... which could then be readily hacked or monitored by those we don't want having access to our lives. 0o |
17:57 |
Lia (tokyorii) |
Same! |
17:57 |
Wordsmith Jarvinen |
Which part of the topic, Val? |
17:57 |
Valibrarian Gregg |
and...btw- next month we meet over at the Community Virtual Library |
17:57 |
Laurali Moon (lauralimoon) |
Bingo Nathan! |
17:57 |
Valibrarian Gregg |
Artificial Intelligence I meant :) |
17:57 |
Lia (tokyorii) |
Paul did you just get here? |
17:57 |
Maia Antarra (psiberangel) |
and what is needed for the IOT is 5G and that ain't good |
17:57 |
Wordsmith Jarvinen |
Yes. Alternating venues. |
17:58 |
PaulKrollWells |
The society created by Zuck and friends may just fall apart. |
17:58 |
Nathan Adored |
Mind you, IoT is also a solution searching for a problem.... and doesn't seem to be taking off that quick... |
17:58 |
Valibrarian Gregg |
For notices- you might join the Second Life Library 2.0 group. Wordsmith is there another group here? |
17:58 |
Nathan Adored |
I mean, do we really NEED a networked, digital toilet or a networked toaster-oven? :D |
17:59 |
Maia Antarra (psiberangel) |
'lol |
17:59 |
Laurali Moon (lauralimoon) |
lol |
17:59 |
Panny (panny.bakerly) |
I'll hack your toilet and flush it when you least expect it. ;) |
17:59 |
Wordsmith Jarvinen |
There's COUG, the Oxbridge scholars group, and the main Caledon Group (Independent State of Caledon), that I announce to. |
17:59 |
Elenabeth Portal |
A networked digital toilet would allow you to quickly and easily post a collection of your "greatest shits" to instagram. |
17:59 |
Panny (panny.bakerly) |
LOL |
17:59 |
|
Nathan Adored giggles and chortles |
17:59 |
Maia Antarra (psiberangel) |
'lol |
17:59 |
Morgaine (morgaine.borgin) |
LOL |
17:59 |
PaulKrollWells |
We don't need any of that stuff it's all about control. Zeitgeist movement ideology. Which is really just communism. |
18:00 |
Lia (tokyorii) |
Lol hacking a toilet to just see what? the butt of someone? |
18:00 |
Valibrarian Gregg |
ok! It has been a pleasure tonight- great conversation! |
18:00 |
Panny (panny.bakerly) |
lol |
18:00 |
Maia Antarra (psiberangel) |
and it went straight to the toilet |
18:00 |
KayCooper |
lol |
18:00 |
Panny (panny.bakerly) |
yes |
18:00 |
Teixeira (boomer.teixeira) |
The bad 1984 communism, not the good "12 people sharing 12 acres of land in an otherwise capitalist system" kind. |
18:00 |
Panny (panny.bakerly) |
p |
18:00 |
Teixeira (boomer.teixeira) |
You know, a commune, hence the name |
18:00 |
Nathan Adored |
well, all those IoT devices could also be combined by hackers into a DoS-attack service |
18:00 |
Wordsmith Jarvinen |
I've got the last transcript and this one to post, which I will do. |
18:00 |
Panny (panny.bakerly) |
Night |
18:00 |
Lia (tokyorii) |
Goodnight! |
18:00 |
Maia Antarra (psiberangel) |
Thanks interesting discussion |
18:00 |
PaulKrollWells |
They want to control everything we use, monitor our habits, our health. People farming for cash. |
18:01 |
Lia (tokyorii) |
Thank you for the conversation everyone |
18:01 |
Elenabeth Portal |
The difference is one is voluntary; the other is forced. |
18:01 |
KayCooper |
Great story choice |
18:01 |
Wordsmith Jarvinen |
http://www.caledonoxbridge.org/cvl_caledon/index.php |
18:01 |
Laurali Moon (lauralimoon) |
Interesting story and discussion |
18:01 |
Maia Antarra (psiberangel) |
yep and we would be just commodities to be exploited |
18:01 |
Morgaine (morgaine.borgin) |
Thank you all. I've learned that I have to go reread everything. |
18:01 |
Lia (tokyorii) |
Um |
18:01 |
PaulKrollWells |
They will slowly work this garbage in "for the good of society" |
18:01 |
Valibrarian Gregg |
I'm heading out. I hope to see you all again next month! |
18:01 |
Wordsmith Jarvinen |
There's also a button on the main Oxbridge website navigation menu |
18:01 |
Maia Antarra (psiberangel) |
or for the children |
18:01 |
Valibrarian Gregg |
Happy Holidays! |
18:01 |
Teixeira (boomer.teixeira) |
What's worse, many humans want to be in power and exploit and harm for the mere sake of it. |
18:02 |
Maia Antarra (psiberangel) |
they would be psychopaths |
18:02 |
PaulKrollWells |
Our best hope is that they self destruct first. |
18:02 |
Maia Antarra (psiberangel) |
some already did |
18:02 |
Teixeira (boomer.teixeira) |
1984 is a warning of those in power possibly having the most depraved of human natures. Telling other people "you can't" just to do it |
18:02 |
Phrynne |
goodbye for now. |
18:02 |
Teixeira (boomer.teixeira) |
Oh no, not psychopaths |
18:02 |
Maia Antarra (psiberangel) |
bye |
18:02 |
Teixeira (boomer.teixeira) |
The true worst of humanity is considered normal by the majority, despite the fact that they are not |
18:02 |
Wordsmith Jarvinen |
Thank you folks for coming and participating. |
18:02 |
Laurali Moon (lauralimoon) |
so AI is right! humans are the problem and should be eliminated.... urg! |
18:02 |
KayCooper |
Thanks Word :) |
18:02 |
Maia Antarra (psiberangel) |
well some humans |
18:02 |
Nathan Adored |
yeah, there's a crossroads right now.... an active battle going on behind the scenes between those who want an authoritarian, centrally-controlled society, and those who want a liberty-minded, more decentralized, Libertarian-ish society. |
18:02 |
Teixeira (boomer.teixeira) |
"Just avoid them" "kind of a jerk" "rules are rules" |
18:03 |
PaulKrollWells |
Just read the Gulag Archipelago. Or listen to it while you drive. It's on youtube. |
18:03 |
Teixeira (boomer.teixeira) |
and next thing you know you thought you were lowering taxes but you have concentration camps in your nation for the third time in history |
18:03 |
Maia Antarra (psiberangel) |
I have read on this quite a bit |
18:03 |
Teixeira (boomer.teixeira) |
The THIRD time, concentration camps... seriously |
18:03 |
KayCooper |
Great discussion. Bye everyone :) |
18:03 |
Cherokee (ak521pg) |
well we could all just go back to the stone age ©©♠Smiles♠©© |
18:04 |
Laurali Moon (lauralimoon) |
bye Kay |
18:04 |
Maia Antarra (psiberangel) |
or fun camps as one politician put it |
18:04 |
PaulKrollWells |
It all goes to global elites leeching off of the masses |
18:04 |
Maia Antarra (psiberangel) |
yep |
18:04 |
Teixeira (boomer.teixeira) |
We can't go back to the stone age. |
18:04 |
Laurali Moon (lauralimoon) |
History just keeps repeating itself... we have very short memories |
18:04 |
Maia Antarra (psiberangel) |
or the dark ages |
18:04 |
PaulKrollWells |
Turning cities and nations into confined animal feeding operations with people instead of cows. |
18:04 |
Teixeira (boomer.teixeira) |
Furthest back we can go is a Horizon Zero Dawn style bronze age with steampunk elements |
18:04 |
Nathan Adored |
That said, I've been reading a bunch of "What went wrong with the society and governments of today and how can we fix it" sorts of books, because I'm planning a novel set in the 2080s... and I want to plausibly craft a more POSITIVE future in it