• The site has now migrated to Xenforo 2. If you see any issues with the forum operation, please post them in the feedback thread.
  • Due to issues with external spam filters, QQ is currently unable to send any mail to Microsoft E-mail addresses. This includes any account at live.com, hotmail.com or msn.com. Signing up to the forum with one of these addresses will result in your verification E-mail never arriving. For best results, please use a different E-mail provider for your QQ address.
  • For prospective new members, a word of warning: don't use common names like Dennis, Simon, or Kenny if you decide to create an account. Spammers have used them all before you and gotten those names flagged in the anti-spam databases. Your account registration will be rejected because of it.
  • Since it has happened MULTIPLE times now, I want to be very clear about this. You do not get to abandon an account and create a new one. You do not get to pass an account to someone else and create a new one. If you do so anyway, you will be banned for creating sockpuppets.
  • Due to the actions of particularly persistent spammers and trolls, we will be banning disposable email addresses from today onward.
  • The rules regarding NSFW links have been updated. See here for details.

Amelia, Worm AU [Complete]

Also... did you really have to invoke THAT retard for your links? Honestly? You couldn't have used anyone else? Ellison? Asimov? Clarke? Y'know, someone with talent?

I just lost a lot of respect for you. Yudkowsky's talent for fiction is irrelevant in terms of usefulness for discussing AI.
 
This is what it looks like when one fanfiction author is jealous that someone else completed a story.

Authoring fanfiction.

Not even once.

Uh, no. I don't think that's jealousy on Tananari's part. I got to around the chapter of Less Wrong's Harry Potter fic where Professor Quirrel sets the students in his class against each in in a 'simulated' warfare demonstration, before realizing I just couldn't stomach the story anymore. I've read stories most people simply can't shut up about their hatred, including all of Skysaber / Perfect Lionheart's published stories, but I couldn't get I think even a third of the way through HP and the Methods of Rationality. It just... lacks... something human. The emotions shown by the characters are pretty flat most of the time and the 'logic' shown is sometimes inconsistent and self-contradictory. Just... no. I don't know how the guy is as a scientist or researcher or whatever applies, but his author's notes made him come across as an ass with a huge ego. I could perfectly understand Tananari not wanting to talk about him.
 
Yudkowsky's talent for fiction is irrelevant in terms of usefulness for discussing AI.
Really? The intelligence and talent of the person discussing a concept has no bearing on the value of his contribution to the discussion?

I... think there's something you're missing here.

I'm not saying his fanfic is bad. I have no ways of knowing that one way or another, as I've never read it- I'm not a Harry Potter fan, why would I read fanfic about it?

I am, however, aware of the whole "Less Wrong" idiocy- that kind of stuff is something I take the time to study (I also studied up on Scientology... which I actually find slightly less creepy than Yudkowsky's "ideas").

I'm saying that the guy doesn't know what he's talking about, and if someone wants to bring up other people to carry speculation on AIs, then he should use someone whose mind actually has *merits* for the subject. Clarke, especially. He invented the geosynchronous satellite, long before anyone bothered to build one- no one wanted to since they thought it would be a waste of money, y'see.

I don't know how the guy is as a scientist or researcher or whatever applies
He's not. He has exactly no formal qualifications as a scholar or scientist. He's an intellectual in the same way Doctor Phil is a medical practitioner.
 
Last edited:
I should probably qualify that. Just so we're clear, Doctor Phil still has a PhD as a psychologist. He has earned the right to call himself a doctor. It's just, he's been known to give health advice he is not qualified to give.

And I don't personally believe being formally recognized is necessarily a measure of ability or knowledge, either. Plenty of true geniuses out there with incomplete or even zero formal education.

And on the opposite side of the scale... George Bush Jr. graduated from Yale AND Harvard. I don't want to turn this into a discussion on politics, and personally I think he gets a worse rap than he deserves, but a giant of intellect, the man is not. Despite having top tier formal qualifications.
 
Huh. The information I could find on Dr. Phil was divided, but most of it made it sound like he didn't have a licence to practice medicine and wasn't a psychologist. Of course, most of those articles were dated several years ago, dating back to a mess with Britney Spears apparently... Still figuring how right or wrong Less Wrong is would easier if if he gave some indication of his credentials, but what I could find on his website didn't indicate anything either way.
 
Huh. The information I could find on Dr. Phil was divided, but most of it made it sound like he didn't have a licence to practice medicine and wasn't a psychologist. Of course, most of those articles were dated several years ago, dating back to a mess with Britney Spears apparently... Still figuring how right or wrong Less Wrong is would easier if if he gave some indication of his credentials, but what I could find on his website didn't indicate anything either way.

Psychologists aren't medical doctors; they don't practice medicine. Originally, they could not prescribe medicines, though now in some states, they can (possibly with supervision by *psychiatrists* who finished medical school + residency + I would suspect, had to pass stricter licensure exams).

This primary difference in the power to prescribe has resulted in the tendency that psychologists are focused on cognitive therapy (because they either can't prescribe medicines or have less background related to medicines), and psychiatrists tend to be focused on treating brain chemistry (because for the longest time, only they could give the drugs for it), but there is more crossover now.
 
The information I could find on Dr. Phil was divided, but most of it made it sound like he didn't have a licence to practice medicine and wasn't a psychologist.
http://en.wikipedia.org/wiki/Phil_McGraw#Education_and_career

Ah, wiki. How I love thee for making my job easier.

Like I said- whether having the education means much is an entirely different subject. Point is, the man does have it.

Still figuring how right or wrong Less Wrong is would easier if if he gave some indication of his credentials, but what I could find on his website didn't indicate anything either way.
Credentials couldn't hurt, certainly. Treating his so-called "research group" as something other than a cult would ALSO be welcome.

Psychologists aren't medical doctors
There is also this. Psychology is actually treated in the community as being more in the humanities, much like sociology... certainly not seen as medicine. Although that differs between research psych, clinical psych, and behaviorist psych. They're VERY different from one another.

but there is more crossover now.
Pretty much, modern health is starting to push the concept of overall wellness, both mental and physical. Still a ways to go, though.
 
Have that person explain my posts, as well.
I could use an explanation, myself.

I just went with the basic assumption that since almost everything you say is sarcastic, that probably was as well. And there may have been a joke I didn't get... that it was an attempt at humor on your part that was slightly off target.
 
"What?" Zach acted shocked and hurt. My software kindly informed me this was a lie, part of whatever act these children were putting on. "It's a perfectly valid point! At least I didn't ask him what robot poontang was like. That'd just be crass."
Quick, Vicky, call up your mom. Someone's stealing the best lines from you!
 
Then you're not trying hard enough. I can think of several.
Yes. And you'll note the conspicuous lack of 'Verified dick' in my membery-avatary-box-thingy.

...I might not know what they're called, but I don't care so it balances out.


I'm not saying his fanfic is bad
It's... not bad. I've read the whole thing (and I'm not using this as an excuse to tell you to read it because you already said it isn't your thing, which is why I'm not saying it was good, just that it wasn't bad). It's pretty well written, though an obvious author tract in parts, but my method of reading ('Hmm, this part looks like it's completely useless to the story or my enjoyment of it and oh shit I hit page down') coupled with my ability to just not care about things works well with a lot of things.

But I had issues with the end where he basically told his readers: Hey, I want you to work out how Harry gets out of this predicament I've thrown him into, otherwise sucks to be him and everyone else.

Fuck that. I don't read to be told to think about things a specific way or to be asked what I think is going to happen next*. I read to immerse myself in worlds other people create. Whether it was the right or wrong thing to do (and I'll admit, reader participation is not an inherently bad thing, it's just not what I want) isn't my place to say, it was just wrong for me, had no warning whatsoever, and the whole thing was posted on a site without a forum to actually discuss things so what's the point of having one single instance of it?

Also he then went on to ignore basic human greed, but I can't work out if that's a good thing or a bad thing since the PoV character is a 10-12 year old kid so... bit hard to tell.

*I am aware I am 'participating' in this thread. It's really not the same thing, in case you're wondering.
 
I could use an explanation, myself.

I just went with the basic assumption that since almost everything you say is sarcastic, that probably was as well. And there may have been a joke I didn't get... that it was an attempt at humor on your part that was slightly off target.
He has done at least one undeniably admirable thing: he finished his fanfiction. That's not technically a joke, but it may be humorous as contrast to the preceding conversation.

I don't have a strong personal opinion about the guy. The few times that I remember following links to his definitions, they seemed decently well done and generally unobjectionable, but that's a small sample and I honestly don't recall exactly what they were about.

I guess I don't get why you were so upset, so I tried to defuse the tension. Maybe I was wrong to do so: has he done some kind of harm to the causes which he purports to champion?
 
I'm saying that the guy doesn't know what he's talking about, and if someone wants to bring up other people to carry speculation on AIs, then he should use someone whose mind actually has *merits* for the subject.

I've only brought it up because of the similarity between Yudkowsky's AI box scenario and Dragon manipulating Pantheon.

I didn't quite expect to summon Dr. Phil's PhD. Sorry.
 
I guess I don't get why you were so upset, so I tried to defuse the tension.
Ah. You made the mistake of thinking I was upset. I do not get why that happens. I mean, I know I use a fair amount of passionate language, but it's just the internet... the only time I've ever actually been emotional thanks to the internet was back in highschool when I stumbled across child pornography. Yeah. THAT upset me. This is not that.

... I'd probably have more moments of net rage if I went on to holocaust denial websites or faces of death or whatever (if that even still exists)... but I tend to avoid the parts of the internet that might actually piss me off. Enough actual real life issues. Bills to pay, asshole coworkers. Younger brother forced to live with my drug addict mother and her deadbeat boyfriend of the week. Younger sister who's setting herself up to be a copy of said mother. Y'know, REAL things to care about.

Maybe I was wrong to do so: has he done some kind of harm to the causes which he purports to champion?
On a level, I think he does. He draws on donations that might have gone to legitimate research or scholar groups. But that's a pretty abstract kind of harm. I'd have to put more effort into researching him to form a true opinion (and I won't because he's not that important to me)... but to my knowledge he's done nothing I find legitimately harmful.

However, he's still an idiot and I reserve the right to mock him and those who support him. As I reserve the right to mock Flat Earthers and Beliebers. I don't find *them* morally offensive, either. Just stupid. Or possibly the most expert trolls of all human history. Poe's Law comes into effect.
 
Last edited:
I speak form experience, Yudkowsky's work is best read if you are somewhere on the autistic spectrum. I personally think he is somewhere on the spectrum (likely the spot aspergers previously occupied) and is undiagnosed or merely un-self aware. He just write in a way that makes me think he processes emotion in the same way high functioning autistic do. He also tends to act with the arrogance of someone who does not realize that the rest of the world does not think the way he does (a common problem with aspergers). I can see why people read his work and feel "all the character's are off emotionally" and "Everyone seems flat". On the other hand, you have people like me who read his work and think "This person sees the world though my eyes".

I don't have a good way to describe it. It's just something in the inner voice of the various characters, and the way they all feel so ofset in their social contact. Aspergers is all about having to be reminded that everyone around you are, infact, people and not robots following a script. I get that feel from his writing. I can see how that just doesn't connect to many people.

Now, I agree, his charity was stupid. Never donate to ANYTHING unless you get a look at the financial statement to see how much actually goes to charity. Not sure that it's malicious though, even a good charity can find itself in horrid management.
 
Well that was a waste of time:

Tried to figure out about things being mentioned > 'AI box experiment sounds interesting' > Try to find out about the experiment > Many links > 'No, not giving out any details'

Bit disappointing...
 
"Know how humans only use ten percent of their brain?"
You piece of-

Oh, okay. Crisis averted.

I think on some level he was building her with the intent to turn her into a weapon. I've found aggression simulation that mimic parahuman conflict impulses. Plus a dozen other pleasure impulses. It's like she has a combat Thinker secondary power."
And Emma totally misinterprets Dragon's trigger event as a normal part of her code. What a goof, eh?

Now that Lisa's gone, I'm curious about how Pantheon will develop with Dragon filling in. She ought to be able to roughly do the job, maybe even be better at it, in addition to multitasking over huge chunks of the planet. Infowar was never this easy.
 
Well that was a waste of time:

Tried to figure out about things being mentioned > 'AI box experiment sounds interesting' > Try to find out about the experiment > Many links > 'No, not giving out any details'

Bit disappointing...

The general concept is that an AI in a box will lie and say anything to be free. Humans are really bad at not applying human emotions to everything, an are very likely to fall for releasing an AI that is willing to lie. Basically it is an open thought experiment that asks the question "You have an AI in a box. It may or may not be able to lie to you. It may or may not be evil. Once it is let out of the box, you cannot stop it. What do you do?"

Answers run the field from "Nuke the box" to "Trust it completely and open it". As far as I know, there is no good answer to safely open the box. The theory is that unless we find said answer, we cannot trust an AI to access anything other than a test computer.
 
Ah. You made the mistake of thinking I was upset. I do not get why that happens.
It's because your speech patterns change from stuff like this, to sounding like someone who is upset.

On a level, I think he does. He draws on donations that might have gone to legitimate research or scholar groups. But that's a pretty abstract kind of harm. I'd have to put more effort into researching him to form a true opinion (and I won't because he's not that important to me)... but to my knowledge he's done nothing I find legitimately harmful.
Now, I agree, his charity was stupid. Never donate to ANYTHING unless you get a look at the financial statement to see how much actually goes to charity. Not sure that it's malicious though, even a good charity can find itself in horrid management.
Meh, when there are real people out there who make a living as a Financial Dominatrix, I just can't care if some guy is using donations to do derpy science.

And Emma totally misinterprets Dragon's trigger event as a normal part of her code. What a goof, eh?
Maybe tracing the Entity tentacle will be easier if it's bad-touching an AI instead of a person? I mean, either the code or the hardware has to be Shard-molested in a measurable way, or Dragon's emotions wouldn't be modified.

If Emma shuts off the Shard influence, will Dragon's Tinker tricks disappear? (Not like she actually needs Tinker tricks if she's unshackled.)

Now that Lisa's gone, I'm curious about how Pantheon will develop with Dragon filling in. She ought to be able to roughly do the job, maybe even be better at it, in addition to multitasking over huge chunks of the planet. Infowar was never this easy.
Dragon: "I am the Web."

Taylor: "I spun the web!"

Amelia: "Taylor, hon..."
 
Bit disappointing...
Sounds about par for the course.


Oh, okay. Crisis averted.
Nailed it!

Maybe tracing the Entity tentacle will be easier if it's bad-touching an AI instead of a person? I mean, either the code or the hardware has to be Shard-molested in a measurable way, or Dragon's emotions wouldn't be modified.

If Emma shuts off the Shard influence, will Dragon's Tinker tricks disappear? (Not like she actually needs Tinker tricks if she's unshackled.)
WoG is Dragon's Trigger was for a Thinker power, not a Tinker one. What that power was, never elaborated upon. But an ability to think in ways to weave around her own restrictions seems to be a good place to start.

Also- they have over a hundred and fifty artificial triggers brainmapped before and after. Plus one natural Trigger thanks to clone!Vicky.

I just can't care if some guy is using donations to do derpy science.
Yeah, pretty much. Like I said, I just think he's dumb as hell and berate those who reference him for anything scientific or intellectual. Reference him as a fanfic writer, and my opinion is "don't know, don't honestly care, I don't read Harry Potter".
 
Now that Lisa's gone, I'm curious about how Pantheon will develop with Dragon filling in. She ought to be able to roughly do the job, maybe even be better at it, in addition to multitasking over huge chunks of the planet. Infowar was never this easy.
Honestly, Dragon is probably much more stable then Lisa ever was. Less prone to pissing contest too, so I could see her as being an improvement.
 
Isn't what I was looking for. I wanted to read through how it happened.


As far as I know, there is no good answer to safely open the box.
If you can create an AI, I'd assume (I'm totally an expert on these things by the way. You can trust me. I'm also not an evil AI) you could create a virtual environment that emulates the environment the AI would get to (especially since it's had zero previous experience with this) if you did 'open the box'. So... let it out into that, leave it running on 100x speed for a couple decades, see what happens?

It's not, strictly speaking, opening the box. But if the AI is allowed to lie, why cant the user?
 
Basically it is an open thought experiment that asks the question "You have an AI in a box. It may or may not be able to lie to you. It may or may not be evil. Once it is let out of the box, you cannot stop it. What do you do?"

It's a bit more than that. The core idea is that a sufficiently smart AI will find a way to convince a human gatekeeper, so if you want it contained, you can't even communicate with it. For example, it can make a credible offer to cure cancer, end world hunger or solve the oil crisis. And it won't matter what you previously thought or believed, it will figure out something to offer you that you won't want to miss out on.

Like Dragon these last few chapters, just without the space whale magic to throw its predictions off.
 
If you can create an AI, I'd assume (I'm totally an expert on these things by the way. You can trust me. I'm also not an evil AI) you could create a virtual environment that emulates the environment the AI would get to (especially since it's had zero previous experience with this) if you did 'open the box'. So... let it out into that, leave it running on 100x speed for a couple decades, see what happens?

It's not, strictly speaking, opening the box. But if the AI is allowed to lie, why cant the user?

Seems like it simply displaces the problem onto 'emulate the environment accurately enough'; since if you haven't emulated the environment to a high standard, how do you even know what you have proved when you run the AI in it?

To me, making an accurate (and computationally feasible) emulation of the environment seems like it has to be orders of magnitude more complex than the (far from simple) task of making a true AI, simply because of scope.
 
Amelia, Ch 363- Crystal
Amelia, Ch 363- Crystal


Amy walked over to us slowly, as if afraid that we might might bolt from sudden movement like a house of cards. She might not be wrong about that. "Umm, I just wanted to let you know we're packing up. We'll be keeping Anima for observation overnight. I've done everything to make sure her body's in perfect shape. What today means for her mind and her powers is still unknown. I promise you Clarice and Elena are the best possible people to look after her."


When did it come to pass that Bonesaw was the best person to look after anyone you cared for? I looked at the rest of my team. They were broken. I was broken. Lily was offering what comfort she could to Sabah. Dubstep was quiet, looking at the horizen as if he was looking for an answer that wouldn't come. Even Boost was quiet. This was not a place for bravado, I was glad he respected that. "What about Genius Loci?"


Amy looked down. "Nowhere to be found. Clarice, Emma and Elena... uh, our experts on how powers work, suspect he was... there's no easy way to put this... he was absorbed in the power interaction. Destroyed by it."


Used as raw material, I added bitterly inside the privacy of my own thoughts. I would never say it out loud, none of us would.


"No!" Sabah exclaimed angrily, jumping to her feet and dragging a very surprised Lily with her. "That's not right! Beth's power can't be responsible!" She broke down into tears, turning to cry into Lily's chest.


I forced myself to take a breath. I need to be strong, I can not be the one who cries. Do not let them see you fall apart, or they'll all lose it. "She's right, it can't. Beth's always been fragile. There has to be another reason, any other reason." It was a command and a plea, as if the universe would obey my words because the alternative was too horrible to imagine being possible.


"Th- the barrier, between dimensions," Sabah offered, her voice every bit as desperate as mine felt. "Citrine said-" She folded, unable to keep herself together any longer.


I looked over at Dubstep. "Is that true, did Citrine think the barrier could be dangerous?" He nodded, but didn't say anything. In a way, he reminded me of Amy. Or of who Amy was before she snapped. But this did give me a way out.


"Then that's what happened." I gave Amy a look I hoped was commanding and meaningful, instead of merely desperate and pathetic. "Citrine was right, and the barrier itself is at fault, not Beth. I don't want to hear anyone ever say anything to the contrary, understood."


Amy hesitated for a moment. "Right, I'll go tell Elena and the others about this... umm... new information." She didn't believe anything I said, but at least she was going to spread the news. Shielding Beth from this was... fuck, GL wouldn't want her feeling guilty over what happened. None of this was her fault, and she didn't need any more reasons to hate her powers.


Amy waited for another minute. "I also wanted to let you know that Lisa's already resigned." For a brief moment I was pulled out of my grief by the surprise. I'd expected her to get in trouble, but fired? Sure, she was a bitch, but she wasn't responsible for the worst of this. I didn't care enough to ask for more.


"Resigned, sure," Lily's voice was bitter, cold.


Amy looked at her, but didn't take the bait. "Dragon will be taking over her responsibilities, at least for the near future." The idea of having Dragon planning for me sounded wonderful. "Lily, would you please stay here for the next few days? The Elite are going to want to retaliate, and with the LA team down two members, they'll probably see it as a weakness to exploit. Hopefully having you in the area will ward them off. We'll fabricate some excuse, probably to do with meeting with some our incoming colonists. Something visible, whatever it ends up being. I have Janus and Victoria ready for emergency alert, so you will have backup in minutes of calling for it. And Janus will be able to quickly take you to Europe for creating new portals and then send you back to LA. I know it's a lot to ask, can you handle it?"


Smart, sending Lily back to work while Sabah's in this condition and Beth's disabled would have made things worse. It would just breed resentment at a time when we can't afford it. I wonder if Lily realizes this, or is simply relieved to be ordered to do what she wanted anyway. "Yeah, I can handle that."


Having Vicky on call is a huge relief, too, since there aren't any political barriers and she's almost Triumvirate tier powerful.


Amy accepted her answer at face value. "And for what it's worth, I'm sorry. Fuck, that sounds so lame. I'm not good at speeches and trying to make people feel better. Just know that if you need it, we're here. Maybe we can help plan a memorial." She turned and walked toward where our Tinkers were waiting. That's Amy for you, always doing things in person instead of using the com systems.


Lily looked over at me. "So, umm, will there be room enough for me to crash for a while, or should I look into renting an apartment?"


Is she not planning to stay with Sabah? Whatever, not my business. As much as I viewed Lily as an overall liability for the team, she was possibly the single most intimidating person on the planet and we could really use that. "Sure, we've got a few spare rooms. They're not much, but at least you'll have a bed."


....


I couldn't let myself cry in front of the others, but once I was alone in my office, I made up for lost time. It wasn't fucking fair! He shouldn't have been there. The risks from that damn dimension barrier meant I never would have authorized him to be that close. The fact that it wasn't the barrier that killed him, that it would only have been luck which saved him, meant nothing. If I was there, then Glen would be alive right now. That's what mattered. Citrine allowed him to stay because he was another possibly useful resource to her instead of a person.


I yawned and blinked my eyes. Fuck, I should at least try to get some sleep.


For a moment, I was confused when the door to leave my office didn't open automatically as I got up to leave. He's really not here anymore. I'd come to start thinking of this building as an extension of his mind, gotten used to the rooms being functionally alive, with an omnipresent consciousness there to anticipate and take care of my needs. Maybe living in what Zach named the Magic Treehouse, with Amy to be the rooms and Taylor to be the consciousness had prepared me for it, but they were always separate and... impersonal wasn't the right word, but they never used the environment to interact with anyone. Glen was the environment, it was his only means of interaction.


"You miss him too, huh?" I looked around for the voice. Derek was sitting there on the couch. Outside of his costume, or in it for that matter, the seventeen year old was smaller than most. A little on the girly side, I might even have found him attractive if he were older and had more confidence. But his personality was too meek, and he had ways of making himself seem even smaller and younger than he already was. I often wondered what made him behave that way, but then I used to wonder the same about Amy, until she suddenly didn't anymore.


I sat down next to him. "Is it that obvious?"


He forced a smile, and I actively ignored his bloodshot eyes. He extended the same favor to me, thankfully. "I just spent fifteen minutes looking for a remote that was sitting on the back of the couch the whole time."


I smiled back. "Glen always was good for that." His power included proprioception of the area he was in. The idea that he could basically feel every curve of every body in the area with him was profoundly creepy. Until I reminded myself of just how incomprehensibly intimate the knowledge Amy's power gave happened to be. The fact that Amy got no form of voyeuristic enjoyment from her power was a relief. The fact that the same was true of Glen was... a small tragedy in its own right. He was always blocked from true touch by a Manton Effect slightly thicker than a piece of construction paper.


"Do you think Beth is going to be okay?"


I swallowed reflexively, fighting away the urge to cry again. I lose the one member on my fucking team that can't be restored from backup. He was supposed to be unkillable, and now he's gone forever. I forced myself to focus on what Derek said. Right, he's worried about Beth. "They said she'd need time to recover. We'll just have to hope that is enough."


"Do... do we hold a funeral for him?"


"He'd hate every second of it." He was the type that wanted to be strong for others, the one that helped me when I was down. The reason I carried the job on this long was because he was a crutch I could lean on. Now the fuck was I supposed to do? "We'd all stand around, tell stories and talk about how we miss him while looking at a stone which doesn't even have a date of birth, sitting on a patch of land which has no casket."


"That means we're holding a funeral, doesn't it?"


I nodded. "Yeah, that means we're holding a funeral. We'll put that off until Anima recovers. She was his friend, too."


===================


A/N- Umm... good news for Zach/Crystal shippers... Emma can't be backed up, either.
 
Last edited:
Seems like it simply displaces the problem onto 'emulate the environment accurately enough'; since if you haven't emulated the environment to a high standard, how do you even know what you have proved when you run the AI in it?

To me, making an accurate (and computationally feasible) emulation of the environment seems like it has to be orders of magnitude more complex than the (far from simple) task of making a true AI, simply because of scope.

And the AI can lie. How do you know it doesn't know it is in a simulation and just pretending to be benevolent?
 
Seems like it simply displaces the problem onto 'emulate the environment accurately enough'; since if you haven't emulated the environment to a high standard, how do you even know what you have proved when you run the AI in it?

To me, making an accurate (and computationally feasible) emulation of the environment seems like it has to be orders of magnitude more complex than the (far from simple) task of making a true AI, simply because of scope.
yeah. it kinda sucks that the best solution for the problem that I've seen is to create a simulation of reality comprehensive enough to hold up to millennia of study and exploration.
 
Seems like it simply displaces the problem onto 'emulate the environment accurately enough'; since if you haven't emulated the environment to a high standard, how do you even know what you have proved when you run the AI in it?
Make another AI to test it! Might work, might not. Run a few of them through the environments you make.

To me, making an accurate (and computationally feasible) emulation of the environment seems like it has to be orders of magnitude more complex than the (far from simple) task of making a true AI, simply because of scope.
Possibly. Make a slightly dumbed down AI, or a few hundred, and have them cracking at the idea for a while, along with humans to guide them where required.

Fuck, skip the whole issue. If you can create an AI, you can likely create something to house a human intelligence. Throw humanity into the box that is the universe and be done with it.
 

Users who are viewing this thread

Back
Top