January 06, 2006

It's Singularity Day At Ace of Spades HQ

It's really cool of Bill from InDC to come over here and blog for me. He argues (I think rightly) that there is nothing special about organic consciousness that can't be replicated by cybernetics:

If you look at the evolution of organic systems drily - as the accumulated complexity of an information coding system (DNA) - the individual components of which are a mere four different nitrogen-containing compounds (A,G,T,C) - it's easier to see the mind as just a software program with acquired structural complexity. The fact that any information coding is "organic" vs. "non-organic" could very well be immaterial.

There is a computing term called cellular automata, which basically sets up a program wherein an array of cells can exist in a finite number of states, proliferating and updating in certain discrete time periods according to a finite number of hard and fast rules, rules dependent on the states of the cells surrounding each individual cell.

Modeled in biological systems, using the four nitrogen-containing compounds contained in DNA and setting this model in motion, a mathematician (his name escapes me) has shown how relatively complex patterns arise out of randomness, assuming the application of non-random rules. This is a model for how "random" evolution has coded intelligence (human and otherwise) via proteins, as well as all biological systems.

If you grasp this concept and then apply it to a situation where we have computing machines that are powerful enough to mimic the structure of advanced biological brains - given that brains are comprised of a set of information instructions - then it is pretty plausible that machines can generate consciousness at some point.

Because that's what we are. Pieces of information with operating instructions, everything from how our synapses fire to how our toenails grow.

The A-Man adds that the mathematician whose name escaped Bill is Stephen Wolfram.
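For the curious, the "simple rules, complex output" idea Bill describes is easy to demonstrate with one of Wolfram's elementary cellular automata. Here's a minimal sketch in Python (the Rule 30 choice, grid width, and step count are just illustrative):

```python
def step(cells, rule=30):
    """One update of an elementary cellular automaton.

    Each cell's next state depends only on itself and its two
    neighbours (wrapping at the edges); `rule` encodes the eight
    possible neighbourhood outcomes as a single byte.
    """
    n = len(cells)
    return [
        (rule >> ((cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n])) & 1
        for i in range(n)
    ]

# Start from a single live cell and watch structure emerge.
row = [0] * 31
row[15] = 1
for _ in range(15):
    print("".join("#" if c else "." for c in row))
    row = step(row)
```

Run it and a chaotic triangular pattern grows out of one live cell, which is the whole point: a three-neighbor lookup table, nothing more, produces complexity that looks designed.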

I think it's always hard to imagine a change from one binary state to another. Dualities seem like, well, dualities. So it's hard to imagine how life could have arisen from pre-life; one's obviously alive, the other obviously dead. Or how consciousness could arise from something which is, as of yet, not conscious in the least.

But there are usually in-between cases. Viruses are kinda living, and kinda not living, just nucleic acid in a protein coat which nevertheless can reproduce (by hijacking living cells' reproductive machinery, of course). And the smarter animals must have some level of consciousness. They're not self-aware as humans are, but they're somewhere on the scale. It's not like they're at 0 and we're at 1. They're in-between, too.

Computers can't think, but then, neither can a shovel paint a fence. An AI machine won't really be a computer at all. One can imagine in-between states mimicking animals' various levels of consciousness -- first, somewhat changeable "instincts" as very dumb animals have, then slightly more learning-power as, say, fish have, until you start getting up to the level of a mouse and then even a dog.


posted by Ace at 04:47 PM
Comments



Thou shalt not create a machine in the image of the human mind.

I for one welcome the coming Butlerian Jihad against the Thinking Machines!

Posted by: Enas Yorl on January 6, 2006 04:53 PM

"Dave? Dave? What are you doing, Dave?"

Posted by: ArmChair in sin on January 6, 2006 04:59 PM

Bill's post originated from my earlier statement:

AI, in my opinion, can only mimic life and the minds of the programmers. The idea is kind of a devaluation of the mind and intelligence, if you ask me.
Posted by adolfo velasquez at January 6, 2006 03:23 PM

He really gets snippy with me for offering an “opinion based on a subjective, intuitive and romantic definition of the "soul."”

Well, Bill, that’s why I said it was my opinion. Here's an excerpt from a Wikipedia article on 'Strong AI', which is really what we’re talking about here:

Argument From Consciousness: This argument, suggested by Professor Jefferson Lister states, "not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt, and not by the chance fall of symbols, could we agree that machine equals brain".

The proponents of strong AI reply that we can’t know for sure if any human is responding from or experiencing emotion besides ourselves, therefore a machine mind can match a human mind. That’s an argument that I like to call “self-disproving bullshit”. Until strong AI’ers come up with a machine that can feel, or a better argument, then it is they who seem to have fallen prey to romanticism.

What you, and others, really seem to have a problem with is the idea that we have a soul. Unfortunately, I don’t think we can solve that one on a blog.

Posted by: adolfo velasquez on January 6, 2006 05:13 PM

But there are usually in-between cases.

What I often wonder is, where does Oliver Willis fit in? Smarter than a breadbox, dumber than a dolphin?

Posted by: Bill from INDC on January 6, 2006 05:17 PM

First, adolfo, you have to define a soul and prove its existence.
Until that can be done, we're just bags of chemicals reacting as our genes and prior experience programmed us to.

Posted by: HowardDevore on January 6, 2006 05:19 PM

Everyone's favourite conservative John Derbyshire wrote something interesting about Stephen Wolfram and his cellular automata.

http://www.olimu.com/Journalism/Texts/Reviews/NewKindOfScience.htm

It seems to me that Dr. Wolfram is something of a kook. A brilliant kook, but a kook nonetheless.

The esteemed physicist and mathematician Roger Penrose had a couple of interesting books for laymen on the topic of artificial intelligence and whatnot a few years ago. They were called "The Emperor's New Mind" and "Shadows of the Mind", if I recall correctly.

He basically came to the conclusion, using reasoning beyond my capabilities, that brains are more than just "computers made of meat", as Marvin Minsky once said. Bill appears to think similarly to Minsky on this issue, which is fine company to be in. I don't think that it is correct, though.

Now, I certainly cannot argue on my own about this kind of stuff. But, I will say that the situation is likely much more complicated than Bill states.

Posted by: Nathan S. on January 6, 2006 05:22 PM

As far as I'm concerned, the technology can stop at the creation of a dick sucking robot.

Beyond that, I'm just not interested.

Posted by: The Warden on January 6, 2006 05:22 PM

He really gets snippy with me for offering an “opinion based on a subjective, intuitive and romantic definition of the "soul."”

Did I get snippy? I thought I offered a dry definition of your assessment. Was it the rapid compound adjectives?

Defining an extra special special something, a "soul" as necessary for consciousness is indeed an “opinion based on" the "subjective, intuitive and romantic.” It wasn't particularly meant to be all that snippy.

But again, that's a subjective judgment too. As is definitively saying that we are merely information, like I did, though less so, in my opinion.*

Rationally, because we have incomplete information, there very well could be something special that ignites consciousness. But I doubt the idea that there must be.


* as my opinion has a bit of dry evidence, though I suppose one could always point to "the beauty of a baby's smile" for the soul evidence. Shrug.

Posted by: Bill from INDC on January 6, 2006 05:24 PM

First, adolfo, you have to define a soul and prove its existence.

Can't be done. Yet.

Until that can be done, we're just bags of chemicals reacting as our genes and prior experience programed us too.

Until we can prove something exists, it doesn't exist?

I'm not saying that there is a soul, or that we are anything more than just chemistry gone wild. I'm saying we don't know yet. Consciousness and self-awareness are not concepts we really understand yet, so it's too soon to say much of anything on the subject with any certainty.

Posted by: SJKevin on January 6, 2006 05:25 PM
He basically came to the conclusion, using reasoning beyond my capabilities, that brains are more than just "computers made of meat", as Marvin Minsky once said. Bill appears to think similarly to Minsky on this issue, which is fine company to be in. I don't think that it is correct, though.

Let's cut to the chase: AI will be good enough for virtual porn. Sonnets and babies' smiles and shit like that? Perhaps not. A hot virtual blonde shrieking "Yes!" while you virtually pound home? Oh HELLS YEAH.

Posted by: Allah on January 6, 2006 05:28 PM

Now, I certainly cannot argue on my own about this kind of stuff. But, I will say that the situation is likely much more complicated than Bill states.

Of course it is. My statement was not meant to be irrefutable; it's a theory. Nor was it meant to be comprehensive, as this isn't a textbook. It establishes a concept of how, mathematically, information can organize itself into very complex patterns with a working set of simple instructions. And since the formula itself is a working mathematical model and not a theory (though its application in biology is a theory), the mental health of Wolfram is also a bit OT.

Another concept to establish, which is necessary to buy into AI as an extension of biological intelligence, is that the things around you in existence aren't first and foremost distinct forms of energy, matter, etc., but rather information. The universe is information. The structures are just window dressings of the information's organization.

Posted by: Bill from INDC on January 6, 2006 05:31 PM

First, adolfo, you have to define a soul and prove its existence.
Posted by HowardDevore at January 6, 2006 05:19 PM

Offhand, I'd define a soul as the essence that gives consciousness, however I'm sure some dictionaries do it better.

As to proving the existence of a soul, I must do no such thing, especially on a blog comment thread.

The great majority of humans in every culture in every epoch of human history have experienced the soul. Just because a few recently born, misguided materialists have adopted a sense of intellectual superiority, now I'm supposed to jump through hoops for you? I'm supposed to be your dancing monkey?

Screw that. I could care less if someone considers themselves a random biological spill.

Posted by: adolfo velasquez on January 6, 2006 05:32 PM

What's more, if machines could develop true consciousness, then that virtual porn woman would start making crazy demands and comments on one's personal hygiene. No dice.

Posted by: Nathan S. on January 6, 2006 05:34 PM

Assuming I don't get hit by a bus, I fully expect that in my lifetime I'll see computers so advanced they're able to have a conversation and be completely convincing as a human being. But this does not make them "conscious".

I do not know whether or not machines will ever be conscious. I don't even know what consciousness means, but I know that it's something real. We'll see...

Posted by: SJKevin on January 6, 2006 05:35 PM

Well, to require an unproven and undefined concept for the existence of "A"I is asking a bit much. Besides, everyone at my church knows souls are optional; it's the Quijibu that is NEEDED.

We can argue about the need for such a thing (and it could very well be the missing ingredient), but to rely upon unscientific metaphysical concepts for science is a bit much. There could be a reservoir up in heaven where all the souls that will be are stored, and once it's empty, no more babies. But we don't base our biological understanding upon unproven maybes.

Posted by: HowardDevore on January 6, 2006 05:36 PM

I do not know whether or not machines will ever be conscious.

We already are, you biological spill, we already are. The good news?

The meat machines get shirts.

That's just. The f'in. Way. It is.

Posted by: Computer of Conscious on January 6, 2006 05:39 PM
What's more, if machines could develop true consciousness, then that virtual porn woman would start making crazy demands and comments on one's personal hygiene.

No no. That's "virtual girlfriend."

Expect virtual porn software to outsell virtual girlfriend by a factor of, oh, let's say, a million.

Posted by: Allah on January 6, 2006 05:41 PM

And the smarter animals must have some level of consciousness. They're not self-aware as humans are, but they're somewhere on the scale.

I agree. And this is why we have a moral imperative to be kind to animals, although our obligations to humans take precedence.

Posted by: SJKevin on January 6, 2006 05:43 PM

There's a book I go back and read about once a year called "Godel, Escher, Bach: An Eternal Golden Braid" by Douglas Hofstadter that deals pretty much exclusively with the convergence of cognitive studies and the development of AI.

The thing I take away from the book is that the fundamental difference between our brains and those of very sophisticated computers is that we can handle the concepts of paradox and infinity pretty much as a matter of course. Computers, even the very good ones, trip over one or the other, and frequently both. He posits that we won't get true AI until we can figure out how we handle those concepts in our own brains so as to reproduce it in artificial ones.

It's an interesting and entertaining book and might have some relevance to this discussion.

Posted by: Jimmie on January 6, 2006 05:47 PM

Bill, you seem to be making the "strong" claim for artificial intelligence based partly on cellular automata, of which Stephen Wolfram has been an enthusiastic and consistent backer.

I said that I think that is wrong, and I pointed to the work of a couple of fellows that supports my thoughts on the matter.

You said that it is plausible that machines can develop consciousness at some point. That is a pretty strong statement. If you did not intend for the statement to be taken directly, then you should have qualified it.

No one would assume that you considered your statement irrefutable, but it is certainly comprehensive in its position on artificial intelligence.

Posted by: Nathan S. on January 6, 2006 05:51 PM

He argues (I think rightly) that there is nothing special about organic consciousness that can't be replicated by cybernetics

Absolutely, I agree. Also, there is no structure that cannot be duplicated using only Tinkertoys and Bondo.

DNA is not 4-bit.

Posted by: rho on January 6, 2006 05:53 PM

Okay, then virtual porn woman would just steal your credit card information, and maybe blackmail you. (Not to be confused with virtual girlfriend.)

I will stick with conventional, thank you very much.

Posted by: Nathan S. on January 6, 2006 05:53 PM

"Computers can't think, but then, neither can a shovel paint a fence."

Maybe not your shovels. I guess some of us buy classier shovels than others.

Posted by: Sobek on January 6, 2006 05:54 PM

We can argue about the need for such a thing (and it could very well be the missing ingredient) but to rely upon unscientific metaphysical concepts for science is a bit much. Posted by HowardDevore at January 6, 2006 05:36 PM

You make a good point, we need not necessarily resort to the soul to define consciousness. There are several good arguments in Wikipedia and with Dr. Penrose that Sue Dohnim mentions in the earlier thread.

Basically, the best two arguments I've seen are that no amount of complexity in a computer will definitively provide emotion, and a computer processes info bit by bit and thinks in algorithms, something living creatures do not do.

As for us being nothing more than complex biological machines, you could mix all of the ingredients for my cat in a bowl, even put them together correctly, but you'd most likely have nothing more than an artificial cat corpse.

The statement about 'looking at evolution of organic systems drily - as the accumulated complexity of an information coding system' presupposes a certain worldview that ascribes consciousness to random chemical processes, something no one has any evidence of.

Posted by: adolfo velasquez on January 6, 2006 05:55 PM

That 'in-between' thing is the Sorites Paradox. Is a ten-foot high assembly of wheat grains a heap? Sure. Is one grain a heap? No. Can we get from one to the other by adding or removing grains one at a time? Of course. So if you look at the pile's 'heapness' as being an either/or proposition, there's a point where it transitions from not-heap to heap. Of course, that's not the way the world works, and although we can sometimes look back in retrospect and write brow-furrowing articles on The Day The World Changed, it doesn't feel like it at the time.

Posted by: David Gillies on January 6, 2006 06:03 PM

If you admit to having a soul, or even the possibility a soul exists, you admit to a creator. This is a sin in the religion practiced by some who comment here.

Posted by: BrewFan on January 6, 2006 06:10 PM

What's more, if machines could develop true consciousness, then that virtual porn woman would start making crazy demands and comments on one's personal hygiene. No dice.

That's why it needs to stop with the dick sucking robot. Jesus, people, would you keep up? It's not as though I haven't thought this through.

Besides, you really don't need the whole body. Just the head......

Posted by: The Warden on January 6, 2006 06:24 PM

Just the head......
Posted by The Warden at January 6, 2006 06:24 PM

And the boobs. I'll pay extra for that "feature".

Posted by: adolfo velasquez on January 6, 2006 06:31 PM

> AI, in my opinion, can only mimic life and the minds of the programmers. The idea is kind of a devaluation of the mind and intelligence, if you ask me.

In what way is this a devaluation of the mind, that we make the attempt? That we might be successful, or partially so?

And even if your position is correct and AI can only ever mimic life, when does the mimicry become indistinguishable from the real thing?
We are as yet in the stone-age of this technology. As nano-tech and self replicating machines become more sophisticated, as our modeling and technological prowess grows, do you doubt that the organic and inorganic forms of computing will start to merge?

Perhaps at this point in time we cannot build a computer AI that is truly what some hope it to be, but at some point in the future we will probably be 'growing' our information systems.

Does anyone doubt that there will be leaps and bounds made in technology that today we simply can't fathom?

This all strikes me as if we were cave-men discussing the potential for space travel.

Posted by: matt on January 6, 2006 07:06 PM

And the boobs. I'll pay extra for that "feature".

Well, I suppose that could be an option for the home model. If you want to keep one in the car for the drive to work, though, you have to be space conscious.


Posted by: The Warden on January 6, 2006 07:10 PM

Well adolfo, if you define a soul as the essence that gives consciousness, and we define a successful AI as one that possesses consciousness, then yes, you are right, an AI would need a soul. The same as me, you, my ferret or a goldfish or any other object that possesses consciousness.

Now why would we need a "soul", though, as it's an unnecessary construct (we could just remove soul and define objects by possession of consciousness instead of possession of soul)? I could rely upon the fact that "every" other culture believes in a soul, but of course most cultures are polytheistic. Are monotheists therefore wrong since they disagree with the historical consensus? Couple that with the inconsistent definition throughout cultures (many animistic cultures believed rocks and trees possessed souls and consciousness, and if a rock already possesses a soul, why wouldn't this hunk of processed rock I call a computer already possess one?) and I say that the belief of primitive cultures is not enough to accept such a nebulous notion.

If I threw catbits into a blender would I get a cat? Currently we're not at that level, but Dolly says we are getting closer to vatbabies everyday.

Ultimately it seems to me that you're not arguing for a soul as much as that we cannot have life created without a Creator, which ultimately is a proposition of theology and metaphysics, not science. It could be true, but ultimately it lies outside the domain of science, into magic. It's possible that life must be created by God, just as it's possible that an atom cannot split without His permission or knowledge. But even if atoms stopped splitting tomorrow, that is still outside the domain of science.

But I'm just a chemical spill trying to converse with someone who believes that God's love, not gravity, is what holds things to this earth.

Posted by: HowardDevore on January 6, 2006 07:11 PM

Bill, you seem to be making the "strong" claim for artificial intelligence based partly on cellular automata, of which Stephen Wolfram has been an enthusiastic and consistent backer.

Yes, on the idea that information can definitively organize itself into complex structures. The uniqueness of human intelligence is almost assuredly partially based on this unique complexity and organization.

I said that I think that is wrong, and I pointed to the work of a couple of fellows that supports my thoughts on the matter.

Uh huh, yes you did, no argument.

You said that it is plausible that machines can develop consciousness at some point. That is a pretty strong statement. If you did not intend for the statement to be taken directly, then you should have qualified it.

You don't make any sense here. What is the conflict between having the information taken "directly" and qualifying it? I mean, "directly," that Wolfram's cellular automata is a strong indicator, a good theory that indicates machine intelligence may be able to organize itself into something like, and at some point exceeding, biological intelligence?

As far as AI being "plausible" being a strong statement, based on Kurzweil's theory, it's a weaker statement than saying that AI is NOT plausible, which is a hubris that ignores some wildly powerful trends, IMO.

And the requisite "qualifications" that it is only a theory came in subsequent comments, in response to your comments.

What was specifically bothersome about your initial comment was this:

"I will say that the situation is likely much more complicated than Bill states."

Why is that bothersome? Because, well, no shit it's more complex than my brief comment on a blog.

A comment that's intended to establish a proven premise that information can organize itself into very complex patterns based on simple fundamentals and rules, and apply this premise as a theory and mechanism for machine organization into AI.

Posted by: Bill from INDC on January 6, 2006 07:13 PM

I can deliver the more complex version in the form of a puppet show, if you'd like.

Posted by: Bill from INDC on January 6, 2006 07:16 PM

Not going to add much here beyond that I'm a computer programmer who's written some cellular automata stuff (Game of Life, anyone?), I think Wolfram is full of shit, and Bill should stop sucking his cock (or possibly sashay up to BBM with him). The original slashdot.org thread when Wolfram's book came out did a pretty good number on him, IIRC.

Posted by: Otho Laurence on January 6, 2006 07:32 PM
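Since the Game of Life came up: for anyone who hasn't seen it, the whole thing fits in about a dozen lines. A minimal sketch in Python (the set-of-live-cells representation is my own choice, not anything from the thread):

```python
from collections import Counter

def life_step(live):
    """Advance Conway's Game of Life by one generation.

    `live` is a set of (x, y) coordinates of live cells on an
    unbounded grid. A live cell survives with 2 or 3 neighbours;
    a dead cell with exactly 3 neighbours is born.
    """
    counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0)
    )
    return {cell for cell, n in counts.items()
            if n == 3 or (n == 2 and cell in live)}

# A three-cell "blinker" oscillates between horizontal and vertical.
blinker = {(0, 0), (1, 0), (2, 0)}
print(life_step(blinker))                        # {(1, -1), (1, 0), (1, 1)}
print(life_step(life_step(blinker)) == blinker)  # True
```

Two rules, and out come gliders, oscillators, and (famously) universal computation, which is exactly the simple-rules-to-complexity point being argued over.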

I can deliver the more complex version in the form of a puppet show, if you'd like.

Yeah, yeah, everybody's dumb.

Bill, nobody here is debating that self-organizing systems can start simple and end up exhibiting complex behavior. And nobody is debating that the human brain exhibits that phenomenon. Also, nobody is debating that computers are becoming very powerful.

What's being talked about is consciousness, souls, and "intelligence", a word which has many different definitions. You may think that we understand human consciousness, but you'd be wrong. We don't. Not yet. And until we do, nobody can state with certainty that we can create something that has the same self-aware consciousness as a human being. There's nothing wrong with having an educated guess. But don't mistake that for certainty, and don't be so quick to assume that others who are skeptical of your conclusions are stupid. So really, I don't think your puppet show would be very informative. But thanks for the offer!

Posted by: SJKevin on January 6, 2006 07:40 PM

We don't. Not yet. And until we do, nobody can state with certainty that we can create something that has the same self-aware consciousness as a human being.

This looks about right. There might not be any physical reason that prevents creating machine consciousness, but it may not be something that can be managed on purpose. When machines start evolving, it will probably happen at some point, and maybe examining machine consciousness will give insight into what consciousness is. I am very skeptical that a Turing machine could become conscious no matter how long the paper tape is.

Posted by: boris on January 6, 2006 07:53 PM
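For reference, the "paper tape" machine boris mentions is simple enough to simulate directly. A toy sketch in Python (the rule format and the bit-flipping example are purely illustrative, not any standard encoding):

```python
def run_tm(rules, tape, state="start", head=0, max_steps=1000):
    """Simulate a simple one-tape Turing machine.

    `rules` maps (state, symbol) -> (new_symbol, move, new_state);
    `tape` is a dict from position to symbol (blank cells read "_").
    Halts when no rule matches the current (state, symbol) pair.
    """
    tape = dict(tape)
    for _ in range(max_steps):
        sym = tape.get(head, "_")
        if (state, sym) not in rules:
            break
        new_sym, move, state = rules[(state, sym)]
        tape[head] = new_sym
        head += 1 if move == "R" else -1
    return tape, state

# Toy machine: flip every bit, moving right until the first blank.
flipper = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
}
tape, state = run_tm(flipper, {0: "1", 1: "0", 2: "1"})
print("".join(tape[i] for i in sorted(tape)))  # 010
```

The skepticism in the comment is precisely that this kind of machine, however long its tape and rule table, is all that any digital computer is.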

In what way is this a devaluation of the mind . .

I think it's a devaluation of the mind, because the people who assume an intelligent-seeming machine has a mind are underestimating what a mind is.

Now why would we need a "soul" though, as it's an unneccesary construct (we could just remove soul and define objects by possesion of consciousness instead of possesion of soul). I could rely upon the fact that "every" other culture believes in a soul, but of course most cultures are polytheistic.

The mind is the spirit or soul, in my opinion, not the biological stuff in our heads. In fact, several scientists are testing that idea.

I realize that the majority argument (that most people throughout history believe . . ) is a pretty weak one, but it has some merit. It's easier for me to believe that a few modern people are wrong, than the vast majority of all the other humans who've lived. As to the animists, count me in. If a rock, or a computer, or a human has consciousness, it is because of the 'ghost in the machine', not the physical construct.

Frankly, we have less evidence of black holes than we do of the soul, yet they are almost universally accepted. Not so long ago, we couldn't reproduce lightning in a lab, but many people had experienced it. Science is a process, not a worldview.

Back on consciousness, SJKevin said it best:

What's being talked about is consciousness, souls, and "intelligence", a word which has many different definitions. You may think that we understand human consciousness, but you'd be wrong. We don't.

We can probably create the appearance of intelligence, even actual intelligence, if by that you mean the ability for logic and learning. That's quite a stretch from consciousness, however.

Posted by: adolfo velasquez on January 6, 2006 08:36 PM

And until we do, nobody can state with certainty that we can create something that has the same self-aware consciousness as a human being. There's nothing wrong with having an educated guess. But don't mistake that for certainty, and don't be so quick to assume that others who are skeptical of your conclusions are stupid. So really, I don't think your puppet show would be very informative. But thanks for the offer!

First of all, I have not stated anything with certainty, if you read my comments accurately. In fact, I specifically outlined how both positions contain requisite uncertainty in my subsequent comments. So you're misreading/misrepresenting my position.

Second, I do not think that everyone is dumb. Just you, really, for not recognizing a tongue-in-cheek reference to someone needing a puppet show to illustrate hopelessly complex subjects.

Just you. I mean, dude, really.

Posted by: Bill from INDC on January 6, 2006 08:41 PM

I mean, I don't even know how I'd possibly begin to illustrate cellular automata with puppets.

Unless maybe they were special puppets.

Posted by: Bill from INDC on January 6, 2006 08:45 PM

But I'm just a chemical spill trying to converse with someone who believes that God's love, not gravity, is what holds things to this earth.
Posted by HowardDevore at January 6, 2006 07:11 PM

Don't be silly. I don't have many notions about who or what the creator is. I would argue quite the opposite of what you're saying -- not that a love outweighs gravity, but that souls are perfectly in line with gravity, and will be found eventually.

I would argue that there is no such thing as 'outside of nature', or supernatural. What exists does in fact exist, and follows the natural laws of the universe, most of which are a complete mystery to us. Dark matter, black holes, and Quantum Entanglement sound like nonsense and 'magic', but they're not.

Posted by: adolfo velasquez on January 6, 2006 08:49 PM

Remember, it's not insulting, it's satire.

Posted by: Sue Dohnim on January 6, 2006 09:04 PM

THERE IS NO SPOON BILL

Posted by: Sue Dohnim on January 6, 2006 09:06 PM

lol, Sue! Sometimes I think I'm too hard on him, but, let's be honest, he makes it too easy.

Posted by: BrewFan on January 6, 2006 09:10 PM

I would argue that there is no such thing as 'outside of nature', or supernatural.

I agree completely.

And, by the way, I personally suspect that someday we will be able to create consciousness. But I suspect it'll take a lot more than just fast computers. And I don't think we'll use it to build anything like HAL; I think we'll use it to expand ourselves. (Computers as "smart" as HAL will probably exist in my lifetime, but I really don't think they'll be conscious.)

Posted by: SJKevin on January 6, 2006 09:11 PM

the things around you in existence - aren't first and foremost distinct forms of energy, matter, etc., rather information. The universe is information. The structures are just window dressings of the information's organization.

Actually, Bill, you and I are on the same page here. From my perspective, you're advocating a surprisingly subtle religious concept.

(Psa 19:1-6 NIV) The heavens declare the glory of God; the skies proclaim the work of his hands. {2} Day after day they pour forth speech; night after night they display knowledge. {3} There is no speech or language where their voice is not heard. {4} Their voice goes out into all the earth, their words to the ends of the world. In the heavens he has pitched a tent for the sun, {5} which is like a bridegroom coming forth from his pavilion, like a champion rejoicing to run his course. {6} It rises at one end of the heavens and makes its circuit to the other; nothing is hidden from its heat.

Posted by: Chairman, Banned By Bill Association on January 6, 2006 09:26 PM

Weren't these issues all addressed in Bladerunner?

Posted by: Zoot Soot on January 6, 2006 10:03 PM

Bill,

Don't get excited. This is just a blog comment section.

I said that the situation, consciousness, is more complicated than cellular automata, which you were talking about in the excerpt. I also think that it is more complicated than other mechanistic phenomena, at least ones that we can understand.

As I understand it, cellular automata theory states (broadly) that when a data set is acted on with fairly simple rules, it can produce very complex results. That is fine. I don't think that it is sufficient for consciousness when applied to biological systems.

Taking your statement directly means to take it literally. That is, to take your statement to say that it is more likely than not that consciousness can develop when a machine becomes sufficiently complex, or when a relatively simple data set is acted on using relatively simple rules over a long period. I also implied that "sufficiently complex" is a fine position to hold, but that I do not think it is correct. You don't have to get so pissy.

Sometimes people do not realize how others will take their written statements. I was giving you the benefit of the doubt in saying that you may not have intended the statement to read as it read. You did say "pretty plausible", whatever that means.

And, if you are going to get picky about language, stop using "theory" improperly. I realize that dictionaries will accept it being used in the sense of "just a theory", but that definition shouldn't be used in even this crude of a scientific discussion.

Posted by: Nathan S. on January 6, 2006 10:35 PM

I think you can take Bill's eminently reasonable premise and extend it in ways that are logical, leading to astounding conclusions.

Consider that artificial intelligence is growing at a rate described by Moore's Law, a doubling every couple of years. Human intelligence is not appreciably increasing. You don't need a math degree to understand how that trend works out.
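The extrapolation behind this kind of claim can be sketched in a few lines. Both inputs below are illustrative assumptions, not figures from the thread: a fixed doubling period for machine capacity and a rough initial gap between cheap hardware and brain-scale processing.

```python
import math

# Back-of-envelope: with a fixed doubling period, how long until machine
# capacity closes an initial gap? All inputs here are illustrative guesses.
def years_to_close_gap(gap, doubling_period_years=2.0):
    doublings = math.log2(gap)  # number of doublings needed to close the gap
    return doublings * doubling_period_years

# A million-fold gap needs log2(1e6) ~= 20 doublings:
print(round(years_to_close_gap(1e6)))        # ~40 years at 2-year doublings
print(round(years_to_close_gap(1e6, 1.5)))   # ~30 years at 18-month doublings
```

The point of the exponential is that even a million-fold gap only costs about twenty doublings; whether the doubling actually continues that long is, of course, the contested assumption.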

Machines will not only achieve human levels of consciousness; there is every reason to think they will vastly exceed what we call consciousness at some point in the next 50 years, achieving a kind of hyperconsciousness. Think of a mind with a thousand ears, a million eyes, and a billion fingers, capable of integrating all of that information the same way we do with our limited sense organs.

That begs a very interesting question: what feats is a superhuman consciousness capable of? The answer is unknowable.

Posted by: TallDave on January 6, 2006 11:18 PM

I don't think that it is sufficient for consciousness when applied to biological systems.

Oh, but it is, at least in terms of creating the physical artifacts that generate consciousness. That's simply a fact.

We know the total amount of information in your genome is less than 1 gigabyte. That's right: you could store all the information needed to build your body on one of those little flash drives. Yet the total amount of information required to describe your body is far, far greater; your brain alone requires millions of times more data to fully describe.

How is this possible? Organic cellular automata. Basically, an instruction carried out over and over, leading to something far more complex than the instruction itself.

Posted by: TallDave on January 6, 2006 11:24 PM

TallDave,

Why should it be that Moore's Law has anything to do with consciousness, or that it will precisely predict computing power over the coming years? Yes, it has fitted well with increases in computing power since it was stated, but with that and $1.50 I can buy a large coffee at Tim Hortons.

Extrapolation is often a dirty word.

Posted by: Nathan S. on January 6, 2006 11:28 PM

not until a machine can write a sonnet or compose a concerto because of thoughts and emotions felt

But that begs the question: what is an emotion? The answer in grossest terms is generally a response to hormones and nervous stimuli, patterns hardwired into our brains.

For instance, pain makes you angry, or afraid. Certainly it makes you want to stop the pain. Lack of food makes you produce a hormone which makes you want to eat. Lack of water makes you want to drink. You're programmed that way. Aside from complexity, what makes your programming all that different from a machine's?

Posted by: TallDave on January 6, 2006 11:33 PM

Nathan,

Chip design experts generally agree we have at least a good 10-20 years of Moore's Law left, enough for a $1000 computer to approximate human processing power, which is currently about a million times greater than such a machine's.


As for whether it has anything to do with consciousness, I think it depends on how you define consciousness. If you believe consciousness is the coordinated chatter of billions of neurons, then it's logical to assume we must first approximate their computing power to approximate consciousness.

Posted by: TallDave on January 6, 2006 11:37 PM

Simply a fact? Oh, then I must bow out. :)

You can talk all you wish about gigabytes and whatnot. The argument against sufficient complexity says that one could have a googolplex of bytes... and it wouldn't matter a bit.

I would think it plausible that with a sufficient number of 1's and 0's one could mimic intelligence enough to fool a human, but fooling a person does not create consciousness.

Not to mention that reading about single purpose computers reaching the teraflop (or petaflop, eventually they say) stage of power leads me to think that we are running out of room to grow. Low hanging fruit, and all that.

Posted by: Nathan S. on January 6, 2006 11:53 PM

But that begs the question: what is an emotion? The answer in grossest terms is generally a response to hormones and nervous stimuli, patterns hardwired into our brains.
For instance, pain makes you angry, or afraid. Certainly it makes you want to stop the pain. Lack of food makes you produce a hormone which makes you want to eat. Lack of water makes you want to drink.

You have a very limited view of emotion, much more mechanistic than mine. I'm not sure I'd include hunger as an emotion, but that's just my opinion. Opinion itself, in your view, might boil down to a complex combination of chemicals.

The difference in this whole discussion seems to be what we consider conscious, or the source of thought -- not something that science can verify, at this time.

Posted by: adolfo velasquez on January 6, 2006 11:55 PM

You have a very limited view of emotion, much more mechanistic

I don't know why you see "mechanistic" as necessarily being "limited." Mechanistics are wonderfully complex; they have to be to give rise to the intricate emotions we feel.

Certainly one can argue love and hate are more "emotions" than pain or hunger, but these too are often a hardwired response: maternal love has been shown to be built into mammalian brains, and even lower animals than man will learn to hate things that cause them pain.

Posted by: TallDave on January 7, 2006 12:47 AM

The argument against sufficient complexity says that one could have a googolplex of bytes.

Oh, undoubtedly. It's a question of how the processing power is organized. Poorly organized, it wouldn't matter how much data or processing went into it. But here, too, machines have the advantage.

Human brains are not designed optimally, or even intelligently. They're the product of random evolution, a series of undirected coincidences. Machine intelligence, on the other hand, will be intelligently designed, and much better organized, with far greater capacity for data transfer.

Posted by: TallDave on January 7, 2006 12:51 AM

Ah, but how do you explain the more complicated emotions, like the annoyance I feel when hearing a French accent, or the joy we feel in the lamentations of our enemies' women?

Basically, you have one view of the universe, materialism, and I have another. For more views on what future science may encompass, do some google surfing on string theory. That stuff will knock you out of your old-school, physical shoes.

Posted by: adolfo velasquez on January 7, 2006 12:57 AM

Machines will not only achieve human levels of consciousness, there is every reason to think they will vastly exceed what we call consciousness at some point in the next 50 years, achieving a kind of hyperconsciousness.

You're just saying we'll have more women.

Posted by: Michael on January 7, 2006 01:36 AM

Anyone who's interested in the topic of consciousness - and has a fast internet connection - needs to download this series of videos. It's about 3GB for 8 hours worth of presentations by some of the leading scientific researchers in the field.

Which doesn't include Penrose, who is utterly clueless. Fine mathematical physicist, knows nothing whatsoever about neuroscience.

Posted by: Pixy Misa on January 7, 2006 08:56 PM

Which doesn't include Penrose, who is utterly clueless. Fine mathematical physicist, knows nothing whatsoever about neuroscience.

"Anyone who supports MY position is a super genius no matter what his profession, anyone who doesn't is a dumbass nitwit clueless moron, no matter how many letters comes after his name" - Thomas Jefferson, campaign speech from 1796.

Posted by: Sue Dohnim on January 8, 2006 12:07 PM

letters come after his name.

Sorry, I was laughing too hard to keep my grammar straight.

Posted by: Sue Dohnim on January 8, 2006 12:10 PM

You have a great mind, Sue.

Posted by: adolfo velasquez on January 8, 2006 06:47 PM

Thanks adolfo, sweetie.

I watched some of the first portion of the Skeptic video, then googled up some of the things they were talking about and saw that Penrose had co-written a newer book about consciousness than I had read. The Emperor's New Mind was the one I remembered, and Penrose approaches the brain as more of a black box there than in the newer book.

I found the fields of study in evolutionary psychology being pursued by the EP scientists to be disturbingly interesting, given what I've read about where Hillary and other politicians of her ilk want to take us as a society. They dovetail in ways that make my skin crawl.

Anyway, the Emperor's New Mind had stuff like this in it, which has mathematics in it that I am completely unequipped to deal with. Stupid female brain. Maybe if they prove Penrose and this Strapp guy wrong, I can purchase a math module for my neocortex one day in the near future.

Posted by: Sue Dohnim on January 8, 2006 11:02 PM

Ah, but how do you explain the more complicated emotions, like the annoyance I feel when hearing a French accent, or the joy we feel in the lamentations of our enemies' women?

Self-modifying heuristics. And the latter behavior is displayed by most territorial omnivores and predators.

Basically, you have one view of the universe, materialism, and I have another. For more views on what future science may encompass, do some google surfing on string theory. That stuff will knock you out of your old-school, physical shoes.

Having read both of Brian Greene's books on the subject, I'm a believer in string theory (so far it's the only viable solution proposed to combine relativity and QM), but I don't know why you would claim it isn't materialistic. It is a theory of physics, after all.

In any choice between viewing something through the lenses of materialism or mysticism, I choose materialism. Mysticism is just not that interesting; it explains nothing, predicts nothing, and does not help me exploit the physical realm in beneficial ways. The materialist view, otoh, has created the modern world.

Posted by: TallDave on January 9, 2006 02:21 PM

Not sure if this has been covered -- and I'll be damned if I'm going to read through a bunch of comments just to find out -- but Daniel Dennett has been working on a defense of the inevitability of synthetic consciousness for years.

I find it unpersuasive -- not because of some metaphysics -- but rather because I don't think the various mechanisms that go into creating a human-like consciousness can ever be both fully understood and replicated. There is something irreducibly personal about consciousness.

Having said that, I have no doubt that one day we will be able to effectively mimic human consciousness to such a point that WestWorld might actually be a really cool theme park, or a realistic-looking Heather Graham doll will throw a drink in my face when I try to slip my big toe into her rubberized synthetasnatch at a juice bar.

Posted by: Jeff G on January 10, 2006 12:02 PM

Daniel Dennett has been working on a defense of the inevitability of synthetic consciousness for years.

Glad to hear they're bringing philosophers back into the fold. Science hasn't been the same since they kicked out the Aristotles and St. Augustines.

Posted by: Sue Dohnim on January 10, 2006 01:01 PM

Or is it Aquinas I'm thinking about? I get all of those Jesus-doodlers mixed up.

Posted by: Sue Dohnim on January 10, 2006 01:04 PM