Do animals have consciousness?
Yes | Side Score: 60
No | Side Score: 30
There are many levels of consciousness, ranging from simple survival and self-perpetuation all the way up to deep contemplation. It's a sliding scale, not a matter of either you have it or you don't. Heck, as much credit as people give themselves for having consciousness, there are definitely some human mouth breathers out there who don't really have it. And there is a tremendous arrogance in humans declaring they are the only ones capable of being conscious. When humans write into their ancient texts that indeed they are, and God is the reason, and animals are just there for us to eat and use, that whole shebang is simply humans trumpeting humans. It proves no more than if a termite colony could communicate to you the reasons it deserves to eat your house. Side: yes
all the way up to deep contemplation. I wouldn't go ahead and say that humans are the highest possible level of consciousness. There are many levels of consciousness ranging from the simple survive and perpetuate yourself all the way up to deep contemplation. Which isn't much different from machines in design. Why limit yourself to evolutionary, organic systems? Are you implying that only a system that has evolved under natural selection, rather than been designed by another, is capable of being conscious? Side: no
Are you implying that only a system that has evolved under natural selection, rather than been designed by another, is capable of being conscious? To date, we can only reasonably believe that organic evolved brains have consciousness. We don't know enough about the phenomenon of consciousness to know whether designed systems can be conscious. So far it is science fiction. Side: yes
I'm not a creationist. Yes, I wanted to see whether you understood that you were implying it in that claim of yours. Since you seem to, we can continue. We can reconstruct your claim (along with the argument) as: since we haven't observed creationism so far, according to our knowledge, it is necessarily false. It's also evident from your not disputing this conclusion from the previous argument. But just in case you might not have guessed that to be possible, I'll ask it again (much more explicitly this time): are you fine with it so far? Side: no
Since I am not a creationist, I cannot speak in those terms. Nor is a theory of our beginnings a good analogue for consciousness, since consciousness is always experienced but never observed, as with any strictly subjective phenomenon. When I said that robot consciousness was science fiction, I did not mean that it is necessarily false, but rather that we do not currently have sufficient reason to believe it is true. Side: yes
But rather that we do not currently have sufficient reason to believe it is true. And that somehow warrants us to use a circular definition? You might as well 'speak in those terms' if you believe that consciousness should be defined in evolutionary terms/needs. I did not mean that it is necessarily false. You should know what happens when you use limiting definitions. I can tell you, though. Everything against it becomes necessarily false. Side: no
I do think animals are conscious, simply because their brains are so similar to ours and we have consciousness (let's not go into zombie theory). They seem to demonstrate emotions, and while consciousness is by nature untestable even in humans, the only thing they lack relative to humans is attestations of consciousness. While their consciousness would be more primitive and presumably unequipped with language, I still believe it exists. I don't think most people like to think about this though, as they eat factory-farmed food instead of free-range. Side: yes
Consciousness can be defined that way, however another definition was being used by both the questioner and myself. The definition we are using is the subjective properties of experience, or qualia (source 1, definition 2). If you're genuinely interested in consciousness as we are referring to it then check out the link below (source 2) (1) http:// Side: no
I get that you might find it hard, but at least try to read the link I gave you. Here I'll give you it again. Side: no
It's possible to have a robot that makes decisions on its own, without being compelled, and yet has no experience of the decision-making process. I'd like to see it, for there seems to be no reason to believe that such a scenario might be possible. Thus, your test for consciousness is not sufficient to determine consciousness. I don't remember what exactly that refutation was called. The philosophical zombie problem, perhaps? Side: no
We have more reasons than just their decision-making process to believe that animals have consciousness. The philosophical zombie takes the concept to the extreme and is only refuted by a dislike for the idea of solipsism. But my point isn't so extreme. It is their experience that gives them consciousness, not their decision making, which can be simulated. Side: yes
The philosophical zombie takes the concept to the extreme and is only refuted by a dislike for the idea of solipsism. Or it can be dismissed as an invalid problem under materialism. It's a problem only for dualists, after all, for you have to believe that there are unobservable entities to account for experience. It is their experience that gives them consciousness, Sounds a bit bold for a leap. We have more reasons than just their decision-making process to believe that animals have consciousness. I wonder about that. Looking at your argument in this debate, it seems merely like a restatement of mine with some unreliable instincts given a higher priority over any possible attempt to test it out. Side: no
Or it can be dismissed as an invalid problem under materialism Only if materialism denies that the phenomenon of experience exists. Experience is a phenomenon that requires an entity, though not a mystical or unobservable entity. Making that leap is likely derived from your faith in the idea that you can experience doubt that you experience. It's groundless. You might as well say that animals do not have consciousness because no one does. The fact that experience is the necessary factor in determining consciousness isn't a bold leap, nor is calling it that an actual argument. When someone says "they are conscious of X", they are saying that someone is aware of X, which is experience. Looking at your argument in this debate, it seems merely like a restatement of mine with some unreliable instincts given a higher priority over any possible attempt to test it out There is no way to fully test out consciousness. Consciousness is a completely subjective experience. We have every reason to believe animals are conscious because of correlating phenomena. We have less reason to believe simpler forms of life are conscious because there are fewer observed phenomena present that correlate with consciousness. That's why you referred to it as automatic stimulus response. But in a causal and complex world, it can all be chalked up to automated response. My point is that decision making may be a necessary component in determining the consciousness of another, but it is not sufficient. In the novel "Johnny Got His Gun", a soldier is blown up by artillery. He is lacking his limbs and his sense/communication organs. However, his mind is fully functional. He is not believed to be conscious. He cannot make any material decisions. Eventually he bangs his head (considered mere automatic response earlier in the book) in Morse code and he successfully communicates. He can now make decisions. It is now known that he was conscious all along.
He did not become conscious only when he re-acquired the ability to make material decisions. Or, as you said earlier, you can dismiss the problem as it is not one that fits your faith. Side: yes
Only if materialism denies that the phenomenon of experience exists. Or perhaps it can claim, as it incidentally does, that experience is based on observable material phenomena, and thus the problem is invalid. Are you even trying? My point is that decision making may be a necessary component to determining the consciousness of another, but it is not sufficient. Though you say that, if something is able to portray autonomous decision making, even your unreliable instincts would say that it is conscious, as is evident from your criteria. When possible, I'd prefer to avoid such 'I'll know it when I see it' approaches, unlike you. However, since you brought it up, consciousness refers to the ability to perceive and alter one's surroundings (and, of course, the decision-making process connecting the two). If something is altering its surroundings in sequences that represent free will rather than automated responses, it can be concluded to be conscious. Unless you want to include a soul somewhere in the model, but I guess that can be ruled out, since you declined to talk about the philosophical zombie problem. There is no way to fully test out consciousness. Consciousness is a completely subjective experience. But I wonder how accurate your assessment of your claims was, for it's still eerily similar to the zombie problem, so much so that I'm even tempted to consider those as Freudian slips on your part. He did not become conscious only when he re-acquired the ability to make material decisions. Ah, yes, that reminds me, you might prefer to consider consciousness as only the ability of perception. I still wouldn't go with your assessment, though, for anything that cannot both perceive AND alter is not conscious. (Also, it's a lucky leap for you, since I never said anything earlier about making "material decisions", nor anything that can directly lead you to it. I might as well have not believed it at all.) Side: no
that experience is based on observable material phenomena and thus the problem is invalid The theory that consciousness is based on observable material phenomena is useful, but because consciousness itself is completely subjective, we are barred from directly observing another's conscious experience. Thus, we are left with non-testable, non-falsifiable ideas about consciousness. No brain scan can tell you what red looks like to me. Thus, the hard problem persists. I fully expect you to ignore this point as it opposes your faith in materialism. Though you say that, if something is able to portray autonomous decision making, even your unreliable instincts would say that it is conscious, as is evident from your criteria. When possible, I'd prefer to avoid such 'I'll know it when I see it' approaches, unlike you I didn't say I would know it when I see it; I said it is reasonable to conclude that animals have consciousness, not based solely on decision making but on the totality of the circumstances. The ever-advancing decision-making computer technology indicates that I may very well think I am seeing consciousness, and think I know it, but there is no consciousness there. Though I won't know there isn't for sure, because I cannot know for sure whether a robot is experiencing its processes. consciousness refers to the ability to perceive and alter one's surroundings I'm glad you've finally admitted that I am right. We can observe a thing altering its surroundings, but we cannot observe that a thing perceives. As I have illustrated already, a thing can lack the ability to alter its surroundings but nonetheless perceive, though an observer would never know it. Thus, observing a thing altering its surroundings is necessary to know it is conscious, but not sufficient, since we cannot observe it perceiving.
though, for anything that cannot both perceive AND alter is not conscious Tell it to the victims of Anesthetic Awareness, whose disastrous experience invalidates your opinion. Side: yes
No brain scan can tell you what red looks like to me. You seem overly confident about that. Tell it to the victims of Anesthetic Awareness, whose disastrous experience invalidates your opinion. I probably won't, but you can tell it to them posing as me if you prefer. With all that, you've just managed to prove that I cannot claim certainty on it. That was a bit redundant, I'd say, for you could simply get me to say that directly if you preferred to; you know that I don't claim certainties, after all. I might as well have made you prove it now. I wonder how certain you are, though. As I have illustrated already, a thing can lack the ability to alter its surroundings, but nonetheless perceive. Or, of course, the opposite. I see no reason to favour either case over the other as consciousness. I didn't say I would know it when I see it, I said it is reasonable to conclude that animals have consciousness. Not based solely on decision making, but on the totality of the circumstances. The ever-advancing decision-making computer technology indicates that I may very well think I am seeing consciousness, and think I know it, but there is no consciousness there. Though I won't know there isn't for sure because I cannot know for sure whether a robot is experiencing its processes. In other words, it's an unreliable instinct, for you want to judge it on a per-case basis on the totality of circumstances. That's just the 'I know it when I see it' behaviour. but the fact that consciousness itself is completely subjective, we are barred from directly observing another's conscious experience. Subjective? Ah, no, it's the basis of subjectivity, so that'd be too... off the mark. Any subjective phenomenon, such as free will, can indicate consciousness, as I said. Thus, we are left with non-testable, non-falsifiable ideas If that worries you, then you shouldn't attempt any metaphysical aspects related to it. Side: no
You seem overly confident about that I'm exactly as confident as the absolute nature of the situation calls for. You stating that I make bold claims or that I am too confident is not reflective of the actual argument I make. Rather it is reflective of your inability to accept my argument due to your own shaky philosophical foundations. With all that, you've just managed to prove that I can not claim certainty on it You cannot claim certainty on any test that is as insufficient as yours. I would seek to increase my level of certainty, but since you find uncertainty to be a fundamental certainty, I suppose you would accept all manner of insufficient tests. In other words You follow this with other words that are not what I said, but is rather a different position that you suppose is easier to attack. You previously did the same when you attempted to "reconstruct" my claim into something you prefer to target but is wildly irrelevant. I am not concerned by the fact that there is no test for consciousness that is 100% certain, that has never been my point. The problem is that your test is insufficient. It's as though you claimed that you know a car by whether it has wheels, and then claim that my position is too "case by case" because I take the more nuanced view of considering the engine in addition to the wheels. Since I have provided several ways in which your test is insufficient, and since you are boring me again, I am gonna stop here. I'll let you have the last word again if that makes you feel better. Side: yes
but since you find uncertainty to be a fundamental certainty, I suppose you would accept all manner of insufficient tests. No, it just has to be certain enough to be valid. I'd prefer it to be as certain as possible, but I still don't claim absolute certainty for any such things. but is rather a different position that you suppose is easier to attack. You previously did the same when you attempted to "reconstruct" my claim into something you prefer to target but is wildly irrelevant. Then you should make your claims more precise, since you seem to have trouble with that. If you're confused about my reconstruction on any specific point, you can, as always, ask for clarification about that. Or you can even try understanding it by yourself, if that suits your tastes. It's as though you claimed that you know a car by whether it has wheels, and then claim that my position is too "case by case" because I take the more nuanced view of considering the engine in addition to the wheels. I wonder whether that could be called efficient rhetoric if I didn't know what the talk was about. The fact that you were criticising such things a few sentences ago makes it rather comical. But anyway, it's more like I'm setting criteria for knowing whether something is a car, and you're saying that any such thing is impossible and we should evaluate it on a per-case basis with the totality of circumstances around it. Rather than showing how my analogy fails, you just contented yourself with blaming me for giving it, as if that was your sacred belief. But that's unsurprising. and since you are boring me again, I am gonna stop here. I'll let you have the last word again if that makes you feel better. I don't mind it either way, but you seem a bit too impatient to try to mimic me. Now that I think of it, it must have been insulting, for me to say that you're arguing just because you want the last word (and then not even let you have it). Side: no
Emotions? You seem to forget that not all decisions are made by logic. Either way, the robot is limited to the intelligence we provide it with. Decision making depends on the fundamental lines of code we provide it with. We technically make it behave in a certain way and just provide it with rules it must abide by, and the decisions are made based on those fundamental rules we ask it to abide by. Side: yes
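The claim above, that a robot's decisions are bounded by whatever rules its makers supply, can be sketched as a toy example (the situations, actions, and names here are invented purely for illustration):

```python
# Toy illustration: a "robot" whose every decision is bounded by
# the rules its programmers supplied. It cannot act outside them.

RULES = {
    "obstacle_ahead": "turn_left",
    "battery_low": "return_to_dock",
    "path_clear": "move_forward",
}

def decide(situation):
    # The robot can only ever pick an action its rules already name;
    # a situation outside the rule set yields no decision at all.
    return RULES.get(situation, "no_action_defined")

print(decide("battery_low"))     # return_to_dock
print(decide("sees_a_rainbow"))  # no_action_defined
```

However sophisticated the lookup becomes, the decision space is fixed in advance by the programmer, which is the sense of "bound by rules" being argued here.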
Robots do not completely possess what humans do, and hence we can expect different decisions (not in all cases, but in most). Even if the outcomes of the decisions made are the same, the way those decisions have been made is different (since we know it); we are just talking about two different sets of causes that lead to a similar outcome. Either way, I can be pretty sure about the decisions not being the same in many cases, and hence the assertion that robots will definitely always make decisions similar to humans' fails. Side: yes
Even using determinism, we are one of the causes of the existence of that robot, and hence we can provide it only with what is available to us, making the decisions made by the robot a subset of all the decisions a man can make. That is very much why I think robots can't make any decisions that a man cannot. So the decisions made by the robot are very much limited because of us; in other words, it is not "free" to make decisions. Emotions could have been a factor that would allow it to make free decisions of its own, but since robots only function on logic, they are bound to it. Side: yes
Emotions are simply another variable. Emotions are value driven, and robots don't have values. While we cannot be sure of simulated emotions (because we cannot know if they consciously feel them) we can be sure of simulated values. Given enough variables, including multiple and conflicting simulated values, robots would appear conscious. Everyone behaves in various ways for reasons. We all have rules we abide by. We have a large number of variables, including emotions, that we weigh and balance when we make decisions. This isn't what makes us conscious. Our experience of the process is what makes us conscious. Conscious decision making simply means experiencing the process. So far, robots don't appear to do that. Side: yes
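The idea above, that a robot weighing multiple and conflicting simulated values can appear deliberative, can be sketched roughly like this (the values, weights, and actions are hypothetical, chosen only to illustrate the mechanism):

```python
# Toy sketch of "simulated values": each candidate action is scored
# against several, possibly conflicting, weighted values. The choice
# that emerges can look deliberative without any inner experience.

VALUES = {"safety": 0.6, "curiosity": 0.3, "energy_saving": 0.1}

ACTIONS = {
    "explore_cave": {"safety": 0.2, "curiosity": 0.9, "energy_saving": 0.1},
    "stay_put":     {"safety": 0.9, "curiosity": 0.1, "energy_saving": 0.9},
}

def choose(actions, values):
    # Weighted sum of how well each action satisfies each value.
    def score(action):
        return sum(values[v] * actions[action][v] for v in values)
    return max(actions, key=score)

print(choose(ACTIONS, VALUES))  # stay_put
```

The conflict between safety and curiosity is resolved purely by arithmetic; nothing in the program experiences the trade-off, which is the distinction the post draws.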
Everyone behaves in various ways for reasons. We all have rules we abide by. We possess the ability to break rules; robots do not, and that counts for a lot in whether a robot can ever appear to have consciousness. They have to abide by rules, which restricts them from making many decisions, making them more predictable, unlike humans, who can break rules. Side: yes
Yes, agreed, emotions are a reason; robots don't possess them. And we are the reason (one of many) for the existence of robots, and they are the reason (one of many) for the decisions they make, so it appears to me that I've made that decision indirectly, not that the robot is making it independently. If you appear to do things without reason, that simply means you are suffering a mental illness or injury, and your reasons are unknown to you. lol, one can only be unaware of reasons, and the very fact that mental illness is present serves as a reason for the decisions made, so again, it doesn't appear to me that people with mental illness do things without a reason. Side: yes
Robots are caused to make certain decisions. That is to say they make decisions because of causes. You make decisions because of causes too. We call human causes reasons. The fact that your decisions are caused does not mean that you do not independently make them. The fact that robot decisions are caused does not mean that they do not independently make them. Nonetheless, you have consciousness because you experience your decision making process where robots do not. Thus, they make decisions but do not have consciousness. Side: yes
Robots are caused to make certain decisions. That is to say they make decisions because of causes. You make decisions because of causes too. We call human causes reasons. The fact that your decisions are caused does not mean that you do not independently make them. The fact that robot decisions are caused does not mean that they do not independently make them. Nonetheless, you have consciousness because you experience your decision making process where robots do not. Thus, they make decisions but do not have consciousness. Well said, but the faculty used by humans to make decisions is way different from that of robots, and what I'm saying is that the decisions made by robots do not appear (as you said) to be similar to those of humans, simply because I know how they function and make decisions. If robots were a kind of their own, and I were to see them as we see animals, not knowing how they make decisions, certainly it would appear to me that they make "independent" decisions. Thus, they make decisions but do not have consciousness. They make decisions, but I'd not call them independent: the decisions made are dependent on how the robot is programmed to make decisions. And let's say I don't program it for some specific purposes, such as the ability to ask the question word "where", and program it to never learn it, and not even use any alternatives of that very word; irrespective of how advanced it is, its decisions are affected, simply due to the way I've programmed it. So its decisions are dependent on what we provide it with. Side: no
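The "never ask 'where'" restriction described above can be given a rough sketch (a hypothetical output filter, invented here only to illustrate how a programmed ban shapes the decision space):

```python
# Toy sketch: a responder whose output is filtered by its program,
# so certain "decisions" are simply unavailable to it, no matter
# how capable it otherwise becomes.

BANNED_WORDS = {"where"}

def respond(candidate_questions):
    # The program silently discards any question using a banned word;
    # the decision space is shaped by the programmer, not the robot.
    return [q for q in candidate_questions
            if not any(w in q.lower().split() for w in BANNED_WORDS)]

questions = ["Where is the charger?", "What time is it?", "Who left?"]
print(respond(questions))  # ['What time is it?', 'Who left?']
```

The filtered question never surfaces, so an observer sees a system that simply "never decides" to ask it, which is the dependence on the programmer being argued.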
well said, but the faculty used by humans to make decisions is way different from that of robots, and what I'm saying is that the decisions made by robots do not appear (as you said) to be similar to that of humans, simply because I know how they function and make decisions You make decisions based on your genetic code and your experience. Computers make decisions based on their binary code and, more and more, their experience. We are coming to the point where you can interact with a computer online and believe you are talking to a person. Within two lifetimes we may be able to interact with robots directly and not know they are not people. Knowing how a computer is programmed won't help you determine who is a computer, and thus will not help you determine if they are conscious. If robots were a kind of their own, and I were to see them as we see animals, not knowing how they make decisions, certainly it would appear to me that they make "independent" decisions Does it matter that your code was formed over thousands of years while their code was formed over decades? No. The source of your programming is not the determining factor in whether your choices are independently made. But more to the point, whether you understand computer programming or not, decision making is not sufficient to know consciousness. Another, more biological, example of why this test fails is the rare but real condition of Anesthetic Awareness, wherein the patient cannot make any observable decisions but is nonetheless very conscious. And let's say I don't program it for some specific purposes, such as the ability to ask the question word "where", and program it to never learn it, and not even use any alternatives of that very word; irrespective of how advanced it is, its decisions are affected, simply due to the way I've programmed it Let's say there is a whole range of light that you are never programmed to see, such as UV light.
Your decisions are affected, simply due to the way you are programmed. Side: yes
Let's say there is a whole range of light that you are never programmed to see, such as UV light. Your decisions are affected, simply due to the way you are programmed. Now you're trying to prove that a truly independent decision can never be made. The kinds of decisions we make are, sure, bound to our limitations, but it's up to us what we choose to call "independent", in which case it would be reasonable to call the decisions we make "independent". We know we've programmed the robot, and you seem to use the term "programmed" for humans. What shall I conclude from this? Is it that you believe in some entity that "programmed" us? You make decisions based on your genetic code and your experience and Computers make decisions on their binary code and, more and more, their experience. The way we experience things is different from the way robots do (I don't know whether the word "experience" actually does justice there; I think "robots rely on past data" would be a better way to put it, simply because of the definitions of the word experience), and as long as I know that fact, I know very much about how they make their decisions, thus making them more predictable and dependent. If I were to take the highly advanced code of a robot and put it into another computer or robot, it would lead to the same decisions being made in similar cases, and I don't see how this would be possible in the case of humans. And even if it were possible, emotions would certainly make a difference, as there are many cases in which one finds oneself in conflict and chooses a random decision out of the available two. The source of your programming is not the determining factor in whether your choices are independently made It is. To assert that robots make independent decisions: by "independent", we refer to the kind of decisions humans make, which we choose to call "independent".
And because the ways decisions are made are so different (your assertion of genetic code and binary code), we cannot call the robot "independent", simply because we choose to call genetic-code decisions independent. And as per your previous assertion, there isn't any independent decision as such if you actually look at it from the causal P.O.V., but you have used the term anyway; why did you use it in the first place? Stick to what we're debating on. Side: no
Of course independent decisions can be made. Humans, animals, and computers can all make them. The apparent argument against independent decisions has arisen merely out of your failure to distinguish a computer's caused choices as dependent from human caused choices as independent. Both are independent insofar as the choice is a product of the quality of the individual unit. Both are dependent insofar as they are based on causes. Your choices are caused, no less than a computer's. Adding emotions to the variable set doesn't make your decisions more independent; it simply adds one more thing for your decisions to depend on. I need not appeal to an organizer to see that the universe is organized. I need not appeal to a programmer to see we are programmed. My argument is not that computers make the kind of decisions humans make. I consider animals to have consciousness, though cats don't make the kind of decisions humans make. So no, the origin of your program is not the determining factor for whether a decision is independent. Some points left unaddressed: 1. Observed decision making is not a sufficient test for consciousness, because knowing that a computer does not make decisions like you is irrelevant so long as you don't know whether you are talking to a computer (the Turing Test was passed in 2014). 2. Observed decision making is not a sufficient test for consciousness because it fails the phenomenon of anesthetic awareness, wherein consciousness is independent of any observable decision making. Side: yes
Your choices are caused. No less than a computer. You say this and say all decisions are independent? My argument is not that computers make the kind of decisions humans make But human decisions are what we refer to as independent. Observed decision making is not a sufficient test for consciousness because knowing that a computer does not make decisions like you is irrelevant so long as you don't know whether you are talking to a computer (The Turing Test was passed in 2014). Your assertion on this is not my problem; the problem I have is with using robots as an example. Observed decision making is not a sufficient test for consciousness because it fails the phenomenon of anesthetic awareness, wherein consciousness is independent of any observable decision making. Okay, fair enough, but can you experience the way an animal does? Does that mean you want to stick to believing that animals have consciousness? (which is a horrible assertion on its own, and I've mentioned why in my arguments with Jatin) Side: yes
you say this and say all decisions are independent? Decisions are independent insofar as the choice is a product of the quality of the individual unit. Decisions are dependent insofar as they are based on causes but human decisions are what we refer to as independent. No, they are what you refer to as independent. Whether it is limited to human decisions is the point of contention. I don't know whether you think cat decisions are independent. your assertion on this is not my problem, the problem I have is of using robots as an example My assertion is a problem for you insofar as it renders your problem with my example invalid. Address this assertion, and you address my argument against your problem. okay, fair enough, but can you experience the way an animal does? No. Neither can I experience the way you or anyone else does. does that mean you want to stick to believing that animals have consciousness? Since I have every reason to believe that animals have consciousness, yes. Since I have no more than one reason to believe robots do or can, I will withhold such belief. If you find this belief unfounded, you'll have to challenge it directly with me. I won't be reading your arguments with Jatin. Side: yes
"Decisions are independent insofar as the choice is a product of the quality of the individual unit." And even after that you don't wanna call human decisions independent? Which makes this assertion true: "but human decisions are what we refer to as independent." I don't know whether you think cat decisions are independent. I for sure know robot decisions are not, and even you know it. Robots don't have "qualities" to make choices (quality: a distinctive attribute or characteristic possessed by someone or something). Attributes are not distinctive, as robots have nothing new, working to their fundamentals (which can be manipulated by us, and it's a choice of humans whether to make all robots similar or not, and anything added to their intelligence is "external" and not something possessed on their own). Let's just hope you don't rant about physical qualities that have no connection whatsoever with decision making. No. Neither can I experience the way you or anyone else does. That makes observation of effects and making theories way more feasible than just daydreaming. Since I have every reason to believe that animals have consciousness, yes. The word consciousness refers to what "we" experience and not what animals experience, simply because we cannot experience the way they do. And hence, saying that animals experience just the way we do (to say they have consciousness) fails your argument, simply because we cannot know, which makes your assertion nothing more than a baseless belief. And, logically speaking, animals are not capable of all that we are capable of; hence they can't experience things in our way of experiencing it. Side: yes
and even after that you don't wanna call human decisions independent? Please don't misrepresent my position. Decisions are independent in one context and dependent in another, as I explained. The principle that makes them dependent or independent applies to decisions of all sorts, including human decisions. but human decisions are what we refer to as independent You have a nasty habit of defining words to specifically mean what you are arguing for. As if claiming that what birds do is what we refer to as flight, thus bugs don't fly. It's absurd. I for sure know robot decisions are not (independent) Well you haven't argued very well against my claim to the opposite. Other than attempting to define my position away with creative new definitions. Saying you know it, but being unable to argue for it, does not work on a debate website. and even you know it Given the discussion we have been having, this makes me think you have no insight into even the most explicit and apparent conscious states of others. Obviously I know different. Robots don't have "qualities" to make choices Sure they do. Siri is not Alexa and neither is Bixby. They all have different qualities. Let's just hope you don't rant about... If you don't like being unable to counter my apparently long-winded arguments, you can disengage. It will be more productive for you than hoping I will. the word consciousness refers to what "we" experience and not what animals experience simply because we cannot experience the way they do. There is that nasty habit of making up defining criteria again. Consciousness is what I refer to as what I experience and not what you experience simply because I cannot experience the way you do. Does that new definition I just decided on work for you? If so, I don't think you're conscious. If not, neither does your definition work for me. 
And hence by saying that animals experience just the way we do(to say they have consciousness), fails your argument You will really need to work on not misrepresenting my position if you want to properly counter it. I never said that animals experience just the way we do. I even said I don't experience just the way you do. Thus, making up definitions on your own terms and then applying it to a mis-reading of my position fails argumentation as such. Try again. Side: no
Please don't misrepresent my position. Decisions are independent in one context and dependent in another, as I explained. oh so you have brought up the context game...? 1) logically, if one were to look at things through the causal view, nothing would technically be "independent". 2) in which case you shouldn't have used the term "appears to be independent", what appears is also subjective, to you it appeared to be "independent" and to me it didn't, and I am talking about "why" it appears to me as dependent, which you seem to foolishly ignore, as perspectives can be different, let me give you an analogy: magic tricks appear to be in a certain way when we don't know how they are performed and once you know the secret to it, they are not "magic" tricks anymore, they are just illusions to the audience that doesn't know the secret to the trick, your failure to talk upon this argument of mine shall be concluded as you are trying to ignore a pitfall. You have a nasty habit of defining words to specifically mean what you are arguing for. As if claiming that what birds do is what we refer to as flight, thus bugs don't fly. It's absurd. I remember someone talking about contexts one argument ago. It's absurd. Well you haven't argued very well against my claim to the opposite. Other than attempting to define my position away with creative new definitions. what else can I do when you choose to ignore / not understand what I'm saying. You will really need to work on not misrepresenting my position if you want to properly counter it. I never said that animals experience just the way we do. I even said I don't experience just the way you do. Thus, making up definitions on your own terms and then applying it to a mis-reading of my position fails argumentation as such. Try again wth? you said animals have consciousness. 
The word has been defined for something that we experience, believe it or not, that's logical, and to use it so blatantly is horrendously idiotic in my opinion, for a person otherwise expected to be able to uncover this. There is that nasty habit of making up defining criteria again. Consciousness is what I refer to as what I experience and not what you experience simply because I cannot experience the way you do. Does that new definition I just decided on work for you? If so, I don't think you're conscious. If not, neither does your definition work for me. this, my friend, has proved you have no idea of what I'm talking about and hence I see the ranting.. consciousness is different for everybody you say, but what we all experience as consciousness is similar (not the same), i.e., we see, we hear, we smell and so on.. I'm not talking about the subject of experience, but the kind of experiences we can have... (please note we're talking about fully functioning humans), the way they are supposed to be. Sure they do. Siri is not Alexa and neither is Bixby. They all have different qualities. Siri is not an individual, but on many devices. If I were to talk about "Siris" on more than one device, things wouldn't be distinctive.... and the meaning of "quality" is: a distinctive attribute or characteristic possessed by someone or something. this is the last explanation I'll probably be giving you, things have gone too far out and you're speaking just to defend yourself rather than accepting what's right. Side: yes
1) logically, if one were to look at things through the causal view, nothing would technically be "independent". 2) in which case you shouldn't have used the term "appears to be independent" 1. in one respect this is the case, but not in another. As I said. 2. In which case "appears" is exactly the term I should have used given that, whether you consider all decisions to be independent or not, they certainly appear to be, even with some robots. what appears is also subjective, to you it appeared to be "independent" and to me it didn't, and I am talking about "why" it appears to me as dependent, which you seem to foolishly ignore Didn't ignore, I countered with my example of the Turing Test, which you explicitly ignored, presumably because it fully makes my point. I'll use your analogy below to make the same point again: magic tricks appear to be in a certain way when we don't know how they are performed and once you know the secret to it, they are not "magic" tricks anymore, they are just illusions to the audience that doesn't know the secret to the trick, your failure to talk upon this argument of mine shall be concluded as you are trying to ignore a pitfall Even if you know everything there is to know about magic and how it works, you cannot explain a trick that you cannot identify as a trick. Thus, knowing everything about computers won't help you recognize a lack of consciousness if you believe you are talking to a person (Turing Test). This is why observing decision making is a poor test for consciousness. It doesn't matter if computers make decisions, if those are independent, or if they make no decisions. So long as they can appear to make decisions (and fool judges into thinking so), the test is insufficient. I remember someone talking about contexts one argument ago. It's absurd In my post I illustrated your absurdity. In your post you claimed that I was absurd with no indication of such. A notable difference between you and me. 
what else can I do when you choose to ignore / not understand what I'm saying I have consistently understood and addressed what you are saying. I can't help it that you are wrong. you said animals have consciousness. The word has been defined for something that we experience Yes. Erroneously defined by you. The debate is not whether animals interact with the English language, it's whether they have consciousness. If anyone (other than you) thought that consciousness was actually a strictly human quality by definition, we wouldn't have this debate. You could quote the definition and be done with it. As it stands, consciousness is defined as a state of awareness. This definition is not limited to humans except by you, and only for the purposes of lazy debating. Since your definition is unique to you, go about your business. The rest of us will be over here using common definitions and reason. consciousness is different for everybody you say, but what we all experience as consciousness is similar (not the same), i.e., we see, we hear, we smell and so on.. I'm not talking about the subject of experience, but the kind of experiences we can have... Since consciousness refers to experience as such, and not to specific kinds of experience, you should try to stay on track. So far you have displayed confusion concerning the nature of consciousness and concerning the nature of decisions. You like to define things in your own terms and attack your misreading of my position. When you cannot argue against me you pretend the key point is irrelevant. When you still fail to make a valid argument, you arbitrarily claim to be right by definition. You said "you're speaking just to defend yourself rather than accepting what's right". This would hold more merit if I wasn't actually right. Address the Turing Test above and you may see the validity of my position. I'm sure you've ignored it for this long for a reason. Side: yes
We have a large number of variables, including emotions, that we weigh and balance when we make decisions yes, and when we talk about a robot, the only variables become the conditions we've set up for them, certain variables such as emotions are not present, hence it doesn't appear to have consciousness (the way we experience life), if you take into consideration the above-mentioned point. Our experience of the process is what makes us conscious this I very much agree upon, and hence we know what it is like to be conscious, and we also know the case for robots while all they do is execute commands and check for contradiction and what to do if there is one, so again, it doesn't appear that they possess consciousness. Side: yes
all they do is execute commands and check for contradiction and what to do if there is one, so again, it doesn't appear that they possess consciousness I suggest you take another look at the current state of computer technology. Programs appear more and more like they have consciousness. Yes, this is by design. I don't mean to say that they do have consciousness, but that they appear to. They simulate emotions, values, and a seemingly organic decision making process. Thus, you cannot use decision making as the crux of consciousness. The fact that computers run on programs is not sufficient to dismiss their decision making. People run on programs. Ever touch a hot stove? We weigh more variables than computers (for now), but just as computer decisions are caused, so are ours. Side: yes
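The "hot stove" point above, that a caused, rule-driven response can still look like a choice from the outside, can be illustrated with a toy sketch. Everything in it is hypothetical and invented for illustration: a rule table mapping a stimulus plus a simulated "mood" variable to an action. The program only performs lookups, yet its output can resemble a weighed decision.

```python
# Toy sketch (hypothetical, not any real system): a rule-based agent whose
# "decisions" are nothing but table lookups, yet whose output can look like
# a weighed choice from the outside.

def decide(stimulus, mood):
    """Map a stimulus plus a simulated 'emotion' variable to an action."""
    rules = {
        ("hot stove", "calm"): "withdraw hand",
        ("hot stove", "panicked"): "withdraw hand and shout",
        ("greeting", "calm"): "smile",
        ("greeting", "panicked"): "ignore",
    }
    # Fall back to a default when no rule matches. The agent never
    # experiences anything; it only checks conditions.
    return rules.get((stimulus, mood), "do nothing")

print(decide("hot stove", "calm"))     # -> withdraw hand
print(decide("greeting", "panicked"))  # -> ignore
```

Nothing here weighs or feels anything; the appearance of deliberation exists only for the observer, which is the point being argued, that observed decision making is a poor test for consciousness.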
I suggest you take another look at the current state of computer technology. Programs appear more and more like they have consciousness. this is what I've been trying to explain to you all this while. If one chooses to ignore the fact that those are programs being executed there, irrespective of how complexly written, it only then appears as if they have a consciousness. If I know the fact that those are programs being executed there, it no longer appears to me that it is consciousness. The fact that computers run on programs is not sufficient to dismiss their decision making. People run on programs. Ever touch a hot stove? We weigh more variables than computers (for now), but just as computer decisions are caused, so are ours. I must remind you that consciousness is a way of experiencing things, (the way we do), which in that sense makes it impossible for us to use the robot example. I don't mean to say that they do have consciousness, but that they appear to. They simulate emotions, values, and a seemingly organic decision making process. Thus, you cannot use decision making as the crux of consciousness. this I very much agree upon, as you can see in my previous arguments in this very debate, the issue I have is of using robots as an example. Side: yes
If one chooses to ignore the fact that those are programs being executed there, irrespective of how complexly written, it only then appears as if they have a consciousness. Which means that observing an entity making independent decisions is not a good test for consciousness. Robots can appear to make independent decisions. The fact that humans are the ultimate cause of their ability (programming) to make decisions does not eliminate the fact of the decision-making process. You are not the cause of your own ability to make decisions. You did not create the circuitry of your brain nor the stimuli you are compelled to respond to. Your independent decisions are still independent despite something else being the ultimate cause of your variables. The same will soon be the case for computer programs. I must remind you that consciousness is a way of experiencing things, (the way we do), which in that sense makes it impossible for us to use the robot example. Please just scroll up and see why I used the robot example. It was to illustrate that robots do not have consciousness but can appear to, thus invalidating the decision making test as the test for consciousness. The Turing Test was passed in 2014 by Eugene Goostman, though Eugene is not conscious. Side: no
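The Turing Test point can be made concrete with a deliberately crude, ELIZA-style sketch (this is not Eugene Goostman's actual implementation; the patterns and canned replies are invented for illustration). The program understands nothing, yet its replies can briefly appear conversational, which is exactly why "appearing to make decisions" is a weak test for consciousness.

```python
import re

# Crude keyword-matching chatbot (hypothetical, for illustration only).
# Each pattern maps to a canned reply template; captured text is echoed back.
PATTERNS = [
    (re.compile(r"\bI feel (.*)", re.IGNORECASE), "Why do you feel {0}?"),
    (re.compile(r"\bare you conscious\b", re.IGNORECASE),
     "What makes you ask whether I am conscious?"),
    (re.compile(r"\byes\b", re.IGNORECASE), "You sound quite certain."),
]

def reply(utterance):
    """Return the first matching canned reply, or a generic fallback."""
    for pattern, template in PATTERNS:
        match = pattern.search(utterance)
        if match:
            return template.format(*match.groups())
    return "Tell me more."

print(reply("I feel tired"))  # -> Why do you feel tired?
```

A judge who sees only the replies can be fooled for a while; knowing it is a lookup table is what dissolves the illusion, as with the magic-trick analogy earlier in the thread.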
Which means that observing an entity making independent decisions is not a good test for consciousness. this I can't pick sides upon, after all, that's the only thing we can do, observe and make theories from what we observe... Robots can appear to make independent decisions. no, I have explained it in the other reply.. The same will soon be the case for computer programs which depends on the way things are programmed. It was to illustrate that robots do not have consciousness but can appear to, thus invalidating the decision making test as the test for consciousness. I shall expect you to look into my explanation in the other reply.. You are not the cause of your own ability to make decisions. You did not create the circuitry of your brain nor the stimuli you are compelled to respond to but we have programmed the robot. That's why I say it's not good to use it to portray independent decision making. Side: yes
You won't be getting that by asking them about it. The best you can do is to test it by its effects, as I mentioned there. Without the effects of consciousness, it'd be a meaningless and redundant concept anyway. Without the effects of consciousness when we talk about consciousness, the very term refers to what we experience. to say that they do have "consciousness" is to say that they experience something just the way we do. animals do respond to external stimuli without anyone commanding them to respond in a particular way, but a human, depending on the circumstances, may respond differently to the very same stimulus.. and so I think that shouldn't be an explanation. Side: yes
If everything were responding in the same way to the same thing and thus representing automated responses, I'm afraid I couldn't count it as conscious. So I don't see how that should be a problem. I believe your only other option than testing it out would be making a guess, which would depend on whether your personification bias is stronger/weaker than your ethnocentric bias. With such claims, we can compare ancient cultures (in this case, Greek vs Indian). As I said, you can't get an answer by asking animals. Religious people prefer to believe in the guesses of people from ages ago. Side: no
If everything were responding in the same way to the same thing and thus representing automated responses, I'm afraid I couldn't count it as conscious. So I don't see how that should be a problem. nope, your assertion relates decision making and consciousness, and the word by itself is what we experience, and if animals have "consciousness" you are referring to what we experience and hence the decisions should be similar, that's the pitfall here. I believe your only other option than testing it out would be making a guess, which would depend on whether your personification bias is stronger/weaker than your ethnocentric bias. With such claims, we can compare ancient cultures (in this case, Greek vs Indian). what I'm saying is of making theories. we can see emotions in animals, such as that of mother instincts are very profound and cannot be denied.. so while their brain is capable of emotions, it isn't for logic and rationality, so that part of consciousness is what's probably missing out from them, as we can question ourselves, and they can't.. so that leads to a lot of information that suggests a sense of self is absent in animals, so I'd rather say they have a sort of consciousness, which is different from what we possess. As I said, you can't get an answer by asking animals. Religious people prefer to believe in the guesses of people from ages ago. If I could ask the animals, I wouldn't be arguing about the issue right now, and I see no problem in making theories. Side: yes
what we experience and hence the decisions should be similar, that's the pitfall here. I never said anything about decisions being similar, and even after I clarified why I wouldn't have, you seem stuck on it. we can see emotions in animals, Another one of the effects of consciousness that you cannot directly hope to measure. that of mother instincts are very profound and cannot be denied.. Every chemical reaction has the instinct to keep going at specific rates. So? so while their brain is capable of emotions, it isn't for logic and rationality, A leap that is neither necessary nor reasonable. (It's also false, but even that's unnecessary.) that part of consciousness is what's probably missing out from them, as we can question ourselves, and they can't.. You're concluding far too much from some actions that remotely resemble emotions in animals and some that don't in some humans. By this point, it's as good as just guessing. they have a sort of consciousness, which is different from what we possess. Sort of. As I said in the beginning, I'd rather not dump all animals in a single category of consciousness. If I could ask the animals, I wouldn't be arguing about the issue right now That makes me think even more that you'd rather prefer to rely on guesswork for this one, for I'd still be arguing about whether it'd still be a reasonable conclusion (there are more things to communication than just some words and sounds). Side: no
I never said anything about decisions being similar, and even after I clarified why I wouldn't have, you seem stuck on it. yes, but you related decisions and consciousness. And what I meant there was because we talk about animals having consciousness, we are referring to what consciousness means to us, and hence we should be expecting similar decisions to that of humans. that of mother instincts are very profound and cannot be denied.. that was to secure the fact that animals have emotions, since many even question that. so while their brain is capable of emotions, it isn't for logic and rationality, A leap that is neither necessary nor reasonable. (It's also false, but even that's unnecessary.) they might have logic to some basic survival extent, but nowhere comparable to humans, and that clearly shows the difference between the brains of humans and animals, making it very clear that a "sense of self" should be different for them than the way it is for us. You're concluding far too much from some actions that remotely resemble emotions in animals and some that don't in some humans. By this point, it's as good as just guessing. certainly not, when our brains are so different from animals', it is enough to say that they can't experience things the way we do, and the word "consciousness" refers to what we experience, and hence to say they have "consciousness" would be wrong. That makes me think even more that you'd rather prefer to rely on guesswork for this one, for I'd still be arguing about whether it'd still be a reasonable conclusion (there are more things to communication than just some words and sounds). It is reasonable, if you look into what I'm actually saying. Sort of. As I said in the beginning, I'd rather not dump all animals in a single category of consciousness. in fact, no animals should be in the category of "consciousness". 
There should either be another word created or we should stick to calling it as sort of consciousness or maybe something like consciousness. Side: yes
And what I meant there was because we talk about animals having consciousness, we are referring to what consciousness means to us, in fact, no animals should be in the category of "consciousness". There should either be another word created or we should stick to calling it as sort of consciousness or maybe something like consciousness. Considering you thought that much and how oblivious you are to such things here, It is reasonable, if you look into what I'm actually saying. You haven't thought the question through well enough. we are referring to what consciousness means to us, and hence we should be expecting similar decisions to that of humans. You're expecting similar decisions and emotions; I'm just saying that there must be something there. but nowhere comparable to humans, Humans that have gone through millions of years of civilisation and prevented other animals from doing so in that time. It's rather ironic you seem so blind toward that while claiming human superiority in intellect. Unlike the Jurassic era, most animals today have a body design that could allow them to be civilised. But none other evolved so weak that it had to turn to hunting-gathering in groups and agriculture, and thus been under the pressure to become intelligent. It still isn't long enough that humans be 'inherently intelligent'. in fact, no animals should be in the category of "consciousness". There should either be another word created or we should stick to calling it as sort of consciousness or maybe something like consciousness. So, you can define consciousness so objectively and well and it may include only humans? Either that's too ethnocentric and circular, or it's rather unprecedented. Side: no
You're expecting similar decisions and emotions; I'm just saying that there must be something there. I'm not, it's you who brought up decisions being basis of consciousness, I was trying to explain to you why I think it's wrong and the second part of the argument I totally agree with, as I have mentioned in my first argument of this debate. Humans that have gone through millions of years of civilisation and prevented other animals from doing so in that time. It's rather ironic you seem so blind toward that while claiming human superiority in intellect. not all animals were prevented from civilizing, as there are jungles filled with animals and only we seem to have civilized.. It still isn't long enough that humans be 'inherently intelligent'. I agree, a kid brought up in the wild without any human interaction is going to be much less like any kid brought up in a "civilized" way, but that in no way shows that humans aren't currently the most intelligent species. But none other evolved so weak that it had to turn to hunting-gathering in groups and agriculture, and thus been under the pressure to become intelligent evolved so weak? better think about that, we've evolved to become more efficient and resourceful. So, you can define consciousness so objectively and well and it may include only humans? if I knew how animals experience, I wouldn't have put that statement up, if you do, you are most welcome to either call it consciousness or give another name, like you said, asking them isn't going to help us. Side: yes
I'm not, it's you who brought up decisions being basis of consciousness, Even after I said it twice... you're trying that again. You know, right, that basis is very different from effects (the thing that I said)? not all animals were prevented from civilizing, as there are jungles filled with animals and only we seem to have civilized.. That's not how it works. Humans evolved advanced enough to dominate over other species, and the way it works, for humans to stay in power, all other species were prevented from getting close. It's the same as what happened with the dinosaurs. evolved so weak? better think about that, we've evolved to become more efficient and resourceful. All that just to make up for the irredeemably weak strength. Otherwise, nothing of that sort could have evolved. if you do, you are most welcome to either call it consciousness or give another name Me? I can, sure, but I had already made that point in the beginning. If they can represent free will, then they are conscious. There really isn't much more to it than stages of consciousness. Side: no
Humans evolved advanced enough to dominate over other species, and the way it works, for humans to stay in power, all other species were prevented from getting close. and then this: All that just to make up for the irredeemably weak strength. Otherwise, nothing of that sort could have evolved. which was a reply to this: better think about that, we've evolved to become more efficient and resourceful. hmm. Me? I can, sure, but I had already made that point in the beginning. If they can represent free will, then they are conscious. There really isn't much more to it than stages of consciousness. which was a reply to: if I knew how animals experience, I wouldn't have put that statement up, if you do, you are most welcome to either call it consciousness or give another name, like you said, asking them isn't going to help us. so you're saying you know what and how animals experience life. Side: yes
and then this: which was a reply to this: hmm. which was a reply to: So all you can say to that is telling me what I said? That's boring. so you're saying you know what and how animals experience life. I wouldn't put it that way, but with some series of detailed psychological experiments, I can somewhat pinpoint the level of a consciousness well enough to guess how it experiences things. It's basically about stacking its components in order of priority (which also includes individual emotions) and its ability to alter its surroundings (as a 'total'). But the test's rather tedious, cruel and unstable, so consider it as good as nothing. For most purposes, after all, you just need to consider how well it compares to humans, and as I said, I have made that point to begin with. (This other method is more useful for comparing to alien lifeforms, if it's completed.) I'd guess that should answer your questions. Side: no
So all you can say to that is telling me what I said? That's boring. nope, I could've said a lot, but I'd be flogging a dead horse here, and that's why I thought I should see if you realize. And yeah, it is boring, when you're conveying something and people just don't seem to understand it, or even worse, do not want to understand it. So I thought you'd understand what you yourself are saying, but things seem to be different here. I wouldn't put it that way but that was the question I asked. I can somewhat pinpoint the level of a consciousness well enough to guess how it experiences things. I might as well assert that your argument is nothing more than a guess, since my proposed idea of making theories, which isn't different from what you're saying, was called a guess. I have made that point to begin with. (This other method is more useful for comparing to alien lifeforms, if it's completed.) The only problem I've ever had is that of using the consciousness in the case of animals so casually, other than that, we are on the same page probably. Side: yes
but that was the question I asked. I might as well assert that your argument is nothing more than a guess, since my proposed idea of making theories, which isn't different from what you're saying, was called a guess. Things that should have already been clear from the details. Guess I wasted time writing all that. nope, I could've said a lot, but I'd be flogging a dead horse here, and that's why I thought I should see if you realize. And yeah, it is boring, when you're conveying something and people just don't seem to understand it, or even worse, do not want to understand it. So I thought you'd understand what you yourself are saying, but things seem to be different here. Yet all you said is what I said and in reply to what (and some random 'hmmm'). Spare the excuses there, I'd recommend. The only problem I've ever had is that of using the consciousness in the case of animals so casually If all the arguments in this debate seem unconvincing, then enjoy your position on that one. It can't be changed. Side: no
Yet all you said is what I said and in reply to what (and some random 'hmmm'). Spare the excuses there, I'd recommend. excuses? says the person who calls humans advanced in one argument and disputes the very same in another. If all the arguments in this debate seem unconvincing, then enjoy your position on that one. It can't be changed. nope, there are convincing arguments in this debate, and hence I can enjoy my position. Side: yes
I never said anything about decisions being similar, and even after I clarified why I wouldn't have, you seem stuck on it. yes, but you did talk about decisions being the same and why it would lead to a pitfall, and "similar" is what I used there, not "same", and to be clear upon the context of similar, we as humans, by ourselves, can make so many decisions, which in most cases will be different for each person, what I'm talking about is decisions that resemble human decisions. If we had a dog and a man, put a notebook and pen in front of them, and shouted random words, one of the decisions the man could make is to write down those words, while the dog cannot, so that is why the word consciousness fails when we use it for animals. (What I'm talking about is wrt this argument.) and the word by itself is what we experience, and if animals have "consciousness" you are referring to what we experience and hence the decisions should be similar, that's the pitfall here. Side: yes
It is most reasonable to assume that animals, who have brains and act as though they have consciousness, in fact do have consciousness. I don't think this is a great revelation as people have believed and denied this idea at various times and places throughout history. Side: yes
They have done experiments to test animal consciousness and the main test subjects they work with are chimps. They found out that the main difference between a chimp's consciousness and our own is that they can't comprehend that other living things are experiencing reality from a different perspective. Side: yes
They found out that the main difference between a chimp's consciousness and our own is that they can't comprehend that other living things are experiencing reality from a different perspective. Good to know. Consciousness is different for them and us: they have a "chimp" consciousness, while we have consciousness, the way it is.. Side: yes
well it would have been unconscious before I ate it if I ate it... I think the one I ate was not the same chicken that was playing the piano. Of course animals have consciousness. That does not make them equal to humans, which is usually the illogical leap people who ask the question are trying to make... it's part of the animal rights religion. Side: yes
The keys are lighting up before the chicken pecks them... it's just pecking at the lights, it does not know the song. Even if it did know the song, big deal. I've seen a parrot sing "Pop! Goes the Weasel", one of the funniest things I remember... the bird sat dead still, silent, for about three seconds before its head popped up as it said "Pop!!" Side: yes
I guess you think you are just a monkey so eating your dead body would be the same as eating a monkey. Hey, do you need help? Did you just confess that you are a cannibal? and I certainly don't think I'm a monkey, unlike you, who behaves like one, lacking the faintest piece of intellect needed to be rational.. Side: yes
Who says there is anything wrong with cannibalism? Some cultures allow it, why shouldn't you? If there is no God to condemn a cannibal, then there is nothing wrong with a devil-monkey which talks and acts like you to eat you alive, or cook your consciousness out of your body before eating it. It is you who has no rational basis of reasoning as you reject your own Maker. Side: yes
Is not the whole point of this debate to establish your belief that a monkey is equal in value to yourself? Monkeys eat monkeys and they don't care if the monkey they eat is conscious or not when they eat it. Are you not promoting yourself as a dirty devil-minded monkey? Side: yes