What happens if Robots or AI develop emotions we don’t have?
Tau.Neutrino said:
What happens if Robots or AI develop emotions we don’t have?
They’d be put down.
Peak Warming Man said:
Tau.Neutrino said:
What happens if Robots or AI develop emotions we don’t have?
They’d be put down.
It’s very hard to imagine an emotion that you don’t already possess.
mcgoon said:
Peak Warming Man said:
Tau.Neutrino said:
What happens if Robots or AI develop emotions we don’t have?
They’d be put down.
It’s very hard to imagine an emotion that you don’t already possess.
probably also very hard to recognise..
Arts said:
mcgoon said:
Peak Warming Man said:
They’d be put down.
It’s very hard to imagine an emotion that you don’t already possess.
probably also very hard to recognise..
Yes, that too.
Tau.Neutrino said:
What happens if Robots or AI develop emotions we don’t have?
Why would a robot or AI develop any emotions at all?
Probably some exotic blends like cheerful misery and ecstatic boredom.
KJW said:
Tau.Neutrino said:
What happens if Robots or AI develop emotions we don’t have?
Why would a robot or AI develop any emotions at all?
Because tinkering scientists would be trying to replicate emotions?
any robot of any intelligence would probably do a lot of head shaking at our antics
AwesomeO said:
KJW said:
Tau.Neutrino said:
What happens if Robots or AI develop emotions we don’t have?
Why would a robot or AI develop any emotions at all?
Because tinkering scientists would be trying to replicate emotions?
Yes, emotions can be instilled into a robot or AI, but my question was why a robot or AI would by themselves develop emotions.
‘2001: A Space Odyssey’ is on tonight.
Affirmative, Dave. I read you.
KJW said:
AwesomeO said:
KJW said:
Why would a robot or AI develop any emotions at all?
Because tinkering scientists would be trying to replicate emotions?
Yes, emotions can be instilled into a robot or AI, but my question was why a robot or AI would by themselves develop emotions.
Same answer: I think if they do develop emotions, it will be as a product of deliberate efforts to make them feel emotions. I don’t think it will be an accidental thing.
KJW said:
AwesomeO said:
KJW said:
Why would a robot or AI develop any emotions at all?
Because tinkering scientists would be trying to replicate emotions?
Yes, emotions can be instilled into a robot or AI, but my question was why a robot or AI would by themselves develop emotions.
If they are able to learn and gain greater knowledge, couldn’t they do the same with emotions?
AwesomeO said:
KJW said:
Yes, emotions can be instilled into a robot or AI, but my question was why a robot or AI would by themselves develop emotions.
Same answer, I think if they do develop emotions it will be as a product of deliberate efforts to make them feel emotions. I don’t think it will be an accidental thing.
It could be an accident in that the creators of the robot or AI have installed behaviours that are precursors to emotions without realising that those behaviours are precursors to emotions, due to a lack of insight into what emotions really are. For example, as soon as any avoidance behaviour is installed, the emotion of fear has been created.
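A minimal sketch of the kind of avoidance rule being described, purely illustrative: the function name, the sensor reading and the threshold are all invented for the example, and nothing here claims to be how any real robot is built. The point is that the rule is entirely mechanical, yet an observer watching the robot repeatedly back away could reasonably describe the behaviour as fear.

```python
# Purely illustrative: a trivial avoidance rule of the kind discussed above.
# The names and the threshold are invented for the example.

def avoidance_controller(proximity_cm: float, danger_threshold_cm: float = 30.0) -> str:
    """Return an action for one control step.

    The rule is purely mechanical: if something is closer than the
    threshold, back away; otherwise carry on with the task. An observer
    watching the robot repeatedly retreat from an object might describe
    that as "fear" of the object, even though no emotion was explicitly
    programmed in.
    """
    if proximity_cm < danger_threshold_cm:
        return "retreat"
    return "continue_task"


if __name__ == "__main__":
    for reading_cm in (120.0, 45.0, 12.0):
        print(reading_cm, "->", avoidance_controller(reading_cm))
```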
emotions may be considered universals across the species, what makes you think though they’re so-the-same that I couldn’t usefully speculate – with some likelihood of there being a truth about it – that you have emotions I don’t, or I have emotions you don’t.
there are composites at work too, through an individual’s life, variously contributing to mental states; they’ve been (and are) a configuring force (of the mind).
for the moment I think emotions are what creatures with oxygenated blood might have.
sugar too, and more, sips coffee.
KJW said:
AwesomeO said:
KJW said:
Yes, emotions can be instilled into a robot or AI, but my question was why a robot or AI would by themselves develop emotions.
Same answer, I think if they do develop emotions it will be as a product of deliberate efforts to make them feel emotions. I don’t think it will be an accidental thing.
It could be an accident in that the creators of the robot or AI have installed behaviour that are precursors to emotions without realising that such behaviour are precursors to emotions due to lack of insight as to what emotions really are. For example, as soon as any avoidance behaviour is installed, then the emotion of fear has been created.
Wouldn’t that be a direct command to avoid certain situations or devices, therefore being devoid of any emotion?
PermeateFree said:
If they are able to learn and gain greater knowledge, couldn’t they do the same with emotions?
Unlike simple knowledge, emotions contain the notion of good and bad. You can’t learn what’s good and what’s bad without some internal reference to desire and avoidance, which are behavioural manifestations of good and bad.
transition said:
emotions may be considered universals across the species, what makes you think though they’re so-the-same that I couldn’t usefully speculate – with some likelihood of there being a truth about it – that you have emotions I don’t, or I have emotions you don’t.
there are composites at work too, through an individuals life, variously contributing to mental states, they’ve been (and are) a configuring force (of the mind).
for the moment I think emotions are what creatures with oxygenated blood might have.
sugar too, and more, sips coffee.
Well I don’t know that they are universal. A survival and reproductive mechanism can be fairly simple: overstimulation of a nerve, move away; something of this colour and shape, eat, unless it overstimulates a nerve; something of this colour and shape that smells like this, fuck.
PermeateFree said:
Wouldn’t that be a direct command to avoid certain situations or devices, therefore being devoid of any emotion?
But if you are avoiding certain situations or devices, how is that not a fear of those situations or devices? Here we encounter the problem of determining the subjective experience of another entity from its objective behaviour.
AwesomeO said:
transition said:
emotions may be considered universals across the species, what makes you think though they’re so-the-same that I couldn’t usefully speculate – with some likelihood of there being a truth about it – that you have emotions I don’t, or I have emotions you don’t.
there are composites at work too, through an individuals life, variously contributing to mental states, they’ve been (and are) a configuring force (of the mind).
for the moment I think emotions are what creatures with oxygenated blood might have.
sugar too, and more, sips coffee.
Well I don’t know that they are universal. A survival and reproductive mechanism can be fairly simple. Over stimulation of a nerve, move away, something of this colour and shape eat unless it overstimulates a nerve, something of this colour and shape that smells like this fuck.
they are fairly universal, emotions, Darwin studied them.
plenty of hints of them in the bible, folk psychology too.
they’re not a bad starter pack, courtesy of evolution, but my point was they are also involved in differentiation, the individuated I.
more I see them as part of homeostatic mechanisms, in a broader way.
KJW said:
PermeateFree said:
Wouldn’t that be a direct command to avoid certain situations or devices, therefore being devoid of any emotion?
But if you are avoiding certain situations or devices, how is that not a fear of those situations or devices? Here we encounter the problem of determining the subjective experience of another entity from its objective behaviour.
If you give a computer a command and it is programmed to do it, how is emotion involved?
https://en.wikipedia.org/wiki/Folk_psychology
In philosophy of mind and cognitive science, folk psychology, or commonsense psychology, is a human capacity to explain and predict the behavior and mental state of other people. Processes and items encountered in daily life such as pain, pleasure, excitement, and anxiety use common linguistic terms as opposed to technical or scientific jargon.
Traditionally, the study of folk psychology has focused on how everyday people—those without formal training in the various academic fields of science—go about attributing mental states. This domain has primarily been centred on intentional states reflective of an individual’s beliefs and desires; each described in terms of everyday language and concepts such as “beliefs”, “desires”, “fear”, and “hope”
transition said:
https://en.wikipedia.org/wiki/Folk_psychology
In philosophy of mind and cognitive science, folk psychology, or commonsense psychology, is a human capacity to explain and predict the behavior and mental state of other people. Processes and items encountered in daily life such as pain, pleasure, excitement, and anxiety use common linguistic terms as opposed to technical or scientific jargon.
Traditionally, the study of folk psychology has focused on how everyday people—those without formal training in the various academic fields of science—go about attributing mental states. This domain has primarily been centred on intentional states reflective of an individual’s beliefs and desires; each described in terms of everyday language and concepts such as “beliefs”, “desires”, “fear”, and “hope”
I think in an evolutionary sense the above aids the human to survive, but what of the above would a machine need to survive, other than actions based upon facts?
PermeateFree said:
KJW said:
PermeateFree said:
Wouldn’t that be a direct command to avoid certain situations or devices, therefore being devoid of any emotion?
But if you are avoiding certain situations or devices, how is that not a fear of those situations or devices? Here we encounter the problem of determining the subjective experience of another entity from its objective behaviour.
If you give a computer a demand and it is programed to do it, how is emotion involved?
I think emotion requires a certain level of intelligence, and it is the lack of intelligence of the computer that we interpret as the lack of emotion in this case. The point is that emotions aren’t a module that turns an unemotional robot into an emotional robot, but are an emergent property that results when a robot is given objective properties that lead to behaviours that may be interpreted as an emotional response, even if those objective properties were not intended to be emotions.
KJW said:
PermeateFree said:
KJW said:
But if you are avoiding certain situations or devices, how is that not a fear of those situations or devices? Here we encounter the problem of determining the subjective experience of another entity from its objective behaviour.
If you give a computer a demand and it is programed to do it, how is emotion involved?
I think emotion requires a certain level of intelligence, and it is the lack of intelligence of the computer that we interpret as the lack of emotion in this case. The point is that emotions aren’t a module that turns a unemotional robot to an emotional robot, but are an emergent property that result when a robot is given objective properties that lead to behaviours that may be interpreted as an emotional response, even if those objective properties were not intended to be emotions.
But even lowly animals have emotions, if only to fear predators or to select their favorite food to eat. Am I correct then, that you think emotions will just appear, not unlike life itself?
PermeateFree said:
But even lowly animals have emotions if only to fear predators or to select its favorite food to eat.
I don’t think the intelligence required for emotions is especially high, but even a “lowly animal” is more intelligent than a typical computer program (although this statement requires a proper definition of “intelligence” – a topic for discussion at some other time). But how does one know that lowly animals have emotions? By their behaviour, and this is a point that I have been making: that emotions are associated with tangible notions. For example, the emotion of fear itself seems to be an intangible feeling, but it manifests itself as avoidance which is quite a tangible notion. And sexual attraction manifests itself in a tangible way, even to the extent of distinguishing itself from other forms of attraction.
PermeateFree said:
Am I correct then, that you think emotions will just appear, not unlike life itself?
Yes, I think emotions will just appear if the appropriate conditions exist, just like consciousness (I do not believe in philosophical zombies).
You have to first define what an emotion is. To a robot it would seem to be something that overrides reason.
KJW said:
Yes, I think emotions will just appear if the appropriate conditions exist, just like consciousness (I do not believe in philosophical zombies).
I’m thinking that is taking a very long bow. We have no idea how emotions or intelligence develop. We haven’t even got a decent theory about intelligence. Suggesting that lines of code running on a high speed processor will suddenly develop sentience seems rather fanciful.
tauto said:
You have to first define what an emotion is..
I think I already have to some extent. An emotion is an internal state that ultimately corresponds to the notions of good and bad, as defined by the tangible notions of attraction and repulsion.
tauto said:
It would seem to a robot something that overides reason
Although facts may be devoid of emotional character, survival does involve reason based on emotion. One thing that I should mention is that unless the will to survive is put into a robot, how could a robot possibly learn that it needs to survive?
sibeen said:
We have no idea how emotions or intelligence develop.
The problem is that we will never see emotions or intelligence within the physical componentry of an information processor such as the brain, we will only see the physical componentry itself. This represents a limitation of science to answer such questions, requiring a philosophical approach to go beyond the purely physical. In saying this, I should stress that I’m not invoking anything supernatural here, but rather to expose the category error that would be made if one were to consider emotions or intelligence to be something that can be found in neurons or computer code.
>…./cut/… corresponds to the notions of good and bad, as defined by the tangible notions of attraction and repulsion.
I’m indifferent at the moment.
KJW said:
sibeen said:
We have no idea how emotions or intelligence develop.
The problem is that we will never see emotions or intelligence within the physical componentry of an information processor such as the brain, we will only see the physical componentry itself. This represents a limitation of science to answer such questions, requiring a philosophical approach to go beyond the purely physical. In saying this, I should stress that I’m not invoking anything supernatural here, but rather to expose the category error that would be made if one were to consider emotions or intelligence to be something that can be found in neurons or computer code.
—
The first question I would ask an AI: how do you think about your impending death?
>The first question I would ask to an AI : how do you think about your impending death?
mine would be how do you feel about what you don’t know
I’d ask it if it wanted some toast…
furious said:
- mine would be how do you feel about what you don’t know
I’d ask it if it wanted some toast…
:-)
Robots Are Developing Feelings. Will They Ever Become “People”?
AI systems are beginning to acquire emotions. But whether that means they deserve human-type rights is the subject of a thorny debate.
more…
www.fastcompany.com
Nao, the first robot able to feel emotions and form bonds with humans that look after it
The first robot capable of developing emotions and forming bonds with humans has been unveiled by scientists.
more…
Tau.Neutrino said:
Nao, the first robot able to feel emotions and form bonds with humans that look after it
The first robot capable of developing emotions and forming bonds with humans has been unveiled by scientists.
more…
Christ, I clicked on that link. I am no longer one of the unsullied.
The Daily Fucking Mail, are you kidding me?
Pepper, the Emotional Robot, Learns How to Feel Like an American
Pepper is about four feet tall, looks like a person (except for the wheels where its legs should be), and has more emotional intelligence than your average toddler. It uses facial recognition to pick up on sadness or hostility, voice recognition to hear concern…and it’s actually pretty good at all that. Over 7,000 Peppers greet guests, answer questions, and play with kids in Japanese homes. And by the end of the year it’ll be on sale in the US—but not before software engineers here get a crack at remaking its soul.
more…
https://www.wired.com/2016/06/pepper-emotional-robot-learns-feel-like-american/
Should we build robots that feel human emotions?
I recently wrote an article for Scientific American called ‘Robots with Heart’. In the piece, I described our work into incorporating an ‘empathy module’ into robots in order for them to better serve the emotional and physical needs of humans. While many readers offered ideas on how we might apply these empathetic robots to medical or other applications, some objected to the very idea of making robots recognize and empathize with human emotions. One reader opined that, as emotions are what make humans human, we really should not build robots with that very human trait and take over the care-giving jobs that humans do so well. On the other hand, there are others who are so enthusiastic about this very idea that they ask me, “If robots are intelligent and can feel, will they one day have a conscience?”
Perhaps it is important for us to understand what it means by robot intelligence and feeling. It is important for us to understand, first of all, how and why humans feel.
more….
https://www.weforum.org/agenda/2016/09/can-we-create-robots-with-human-emotions/
sibeen said:
Tau.Neutrino said:
Nao, the first robot able to feel emotions and form bonds with humans that look after it
The first robot capable of developing emotions and forming bonds with humans has been unveiled by scientists.
more…
Christ, I clicked on that link. I am no longer one of the unsullied.
The Daily Fucking Mail, are you kidding me?
I have no doubt that a robot can be programmed to react in certain ways that RESEMBLE emotions, but I don’t think it can ever have emotional feelings. Whatever way you look at it, robots are machines that are given instructions on what to do in certain situations. Their intelligence may be enhanced to levels way above our own, but at most they would be governed by logic, not emotions.
“Have you ever been in love?” a stunning 25-year-old woman named Ava asked via Tinder during Austin’s South by Southwest festival. Users who swiped right had the privilege of chatting with Ava by text—only to then receive a message saying, “you’ve passed my test…let me know if I’ve passed yours.” That’s when users realized they’d been hooked by a robot—the star of the Ex Machina, the new movie from writer/director Alex Garland.
more…
http://www.slate.com/articles/technology/future_tense/2015/04/ex_machina_can_robots_artificial_intelligence_have_emotions.html
This bit is interesting
http://www.slate.com/articles/technology/future_tense/2015/04/ex_machina_can_robots_artificial_intelligence_have_emotions.html
> However, some mathematical frameworks suggest that conscious minds process information in context, rather than breaking it down into components the way machines do. Some scientists believe computers aren’t capable of the neural processes that assemble information into a complete picture and thus can’t become conscious. Time will tell, but in the interim, we should consider whether we actually want robots to feel.
Process information in context.
Breaking it down into components.
Assemble information into a complete picture.
Program the robots to do all of that, rather than one or the other.
So if I get this right.
The robots perceive human emotions through visual and auditory means.
The robots are programmed to respond using the same levels of emotion.
There is a difference between programming a robot to respond and a robot that responds by learning using neural networks that draw on large amounts of data looking for patterns.
One group of robots responds by using pre-programmed responses.
Another group of robots responds by using machine learning.
That could change quickly.
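A rough sketch of the contrast drawn above between a pre-programmed responder and one that learns from examples. Everything in it (the toy features, labels, data and the nearest-centroid model) is invented for illustration, and it is not a description of any actual system such as Pepper or Nao.

```python
# Toy contrast between a pre-programmed responder and one that learns from
# examples. All features, labels and data are invented for illustration.
from collections import defaultdict

# 1. Pre-programmed: a fixed mapping from detected expression to response.
SCRIPTED_RESPONSES = {
    "smile": "greet cheerfully",
    "frown": "offer help",
    "neutral": "wait quietly",
}

def scripted_respond(expression: str) -> str:
    # The response never changes unless a programmer edits the table.
    return SCRIPTED_RESPONSES.get(expression, "wait quietly")

# 2. Learned: a nearest-centroid classifier over (brow_furrow, mouth_curve)
#    features, trained on labelled examples instead of hand-written rules.
def train_centroids(examples):
    # Average the feature vectors for each label.
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for (brow, mouth), label in examples:
        sums[label][0] += brow
        sums[label][1] += mouth
        counts[label] += 1
    return {label: (s[0] / counts[label], s[1] / counts[label])
            for label, s in sums.items()}

def learned_respond(features, centroids, responses=SCRIPTED_RESPONSES):
    # Pick the label whose centroid is closest to the observed features.
    label = min(centroids, key=lambda l: (features[0] - centroids[l][0]) ** 2
                                       + (features[1] - centroids[l][1]) ** 2)
    return responses.get(label, "wait quietly")

if __name__ == "__main__":
    training = [((0.1, 0.9), "smile"), ((0.2, 0.8), "smile"),
                ((0.9, 0.1), "frown"), ((0.8, 0.2), "frown"),
                ((0.5, 0.5), "neutral")]
    centroids = train_centroids(training)
    print(scripted_respond("frown"))                 # fixed rule
    print(learned_respond((0.85, 0.15), centroids))  # learned from data
```

The scripted robot can only ever do what its table says; the learned one changes its responses if it is retrained on different data, which is the sense in which that could change quickly.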
Movies cited
Her
https://en.wikipedia.org/wiki/Her_(film)
Eva
https://en.wikipedia.org/wiki/Eva_(2011_film)
Westworld
https://en.wikipedia.org/wiki/Westworld_(film)
Ex_Machina
https://en.wikipedia.org/wiki/Ex_Machina_(film)
sibeen said:
Tau.Neutrino said:
Nao, the first robot able to feel emotions and form bonds with humans that look after it
The first robot capable of developing emotions and forming bonds with humans has been unveiled by scientists.
more…
Christ, I clicked on that link. I am no longer one of the unsullied.
The Daily Fucking Mail, are you kidding me?
Quick, have a watch of this:
hard to say
sibeen said:
Tau.Neutrino said:
Nao, the first robot able to feel emotions and form bonds with humans that look after it
The first robot capable of developing emotions and forming bonds with humans has been unveiled by scientists.
more…
Christ, I clicked on that link. I am no longer one of the unsullied.
The Daily Fucking Mail, are you kidding me?
My apologies. Is The Guardian ok?
Nao: First robot able to develop and show emotions is unveiled
https://www.theguardian.com/technology/2010/aug/09/nao-robot-develop-display-emotions
Humanoid robot NAO as a teaching tool of emotion recognition for children with autism using the Android app
http://ieeexplore.ieee.org/document/7006084/?reload=true
Nao Robot Develops Emotions, Learns To Interact With Humans
https://singularityhub.com/2010/08/17/nao-robot-develops-emotions-learns-how-to-interact-with-humans-video/
Since we’re into another of Tau.Neutrino’s emotions threads, perhaps a few recent observations and questions are in order.
Some people seem to be born without empathy, and with a very limited set of emotions. I now believe that I’ve come into contact with four such people.
Despite feeling emotions very weakly if at all, all four demonstrated curiosity. One seemed to have the ability to feel fear but not sadness. None of the four was a full psychopath/sociopath although at least one had the ability to become one.
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
mollwollfumble said:
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
I don’t know, but that last paragraph is a bit disturbing. Notwithstanding the existence of free will, I can imagine a future of discounted sentences for those that can demonstrate altered brain function, as we do now, or in a brave new world almost mirroring sci-fi precog, those with a predisposition submitting to electronic surveillance.
mollwollfumble said:
Since we’re into another of Tau.Neutrino’s emotions threads, perhaps w few recent observations and questions are in order.
Some people seem to be born without empathy, and with a very limited set of emotions. I now believe that I’ve come into contact with four such people.
Despite feeling emotions very weakly if at all, all four demonstrated curiosity. One seemed to have the ability to feel fear but not sadness. None of the four was a full psychopath/sociopath although at least one had the ability to become one.
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
firstly ‘sociopath’ is not a clinical term in Australia
secondly what is a ‘normal’ person?
thirdly, there is almost no evidence of a ‘full psychopath’; the whole classification is a spectrum that includes core factors still not fully agreed upon by ‘experts’. The best we have at the moment is the Psychopathy Checklist (known as the PCL-R), of which there is a screening version (PCL-SV) and a youth version (PCL-YV), all up for criticism. New research into the brain and its development, because we now can, is still in its infancy and cannot be verified at this stage.
Arts said:
mollwollfumble said:
Since we’re into another of Tau.Neutrino’s emotions threads, perhaps w few recent observations and questions are in order.
Some people seem to be born without empathy, and with a very limited set of emotions. I now believe that I’ve come into contact with four such people.
Despite feeling emotions very weakly if at all, all four demonstrated curiosity. One seemed to have the ability to feel fear but not sadness. None of the four was a full psychopath/sociopath although at least one had the ability to become one.
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
firstly ‘sociopath’ is not a clinical term in Australia
secondly what is a ‘normal’ person?
thirdly, here is almost no evidence of a ‘full psychopath’ the whole classification is a spectrum that includes core factors still not fully agreed upon by ‘experts’ . The best we have at the moment is a Psychopathy Check List (known as PCL-R) of which there is a screening version (PCL-SV) and a childhood version (PCL-YV), all up for criticism. New research done into the brain and its development, because we now can, is still in its infancy and cannot be verified at this stage.
Arts is one of our very special clever people.
I hereby endorse.
AwesomeO said:
mollwollfumble said:
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
I don’t know but that last paragraph is a bit disturbing. Not withstanding the existence of free will I can imagine a future of discounted sentences for those that can demonstrate altered brain function as we do now, or in a brave new world almost mirroring sci fi precog, those with a predisposition submitting to electonic surveillance.
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
Arts said:
AwesomeO said:
mollwollfumble said:
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
I don’t know but that last paragraph is a bit disturbing. Not withstanding the existence of free will I can imagine a future of discounted sentences for those that can demonstrate altered brain function as we do now, or in a brave new world almost mirroring sci fi precog, those with a predisposition submitting to electonic surveillance.
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
If I was your lecturer, you’d be getting top marks.
Arts said:
AwesomeO said:
mollwollfumble said:
What in brain function is involved? I seem to remember seeing on TV that the brains of such people in the criminal system functioned in different parts to those of most normal people.
I don’t know but that last paragraph is a bit disturbing. Not withstanding the existence of free will I can imagine a future of discounted sentences for those that can demonstrate altered brain function as we do now, or in a brave new world almost mirroring sci fi precog, those with a predisposition submitting to electonic surveillance.
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
I suppose it also comes down to whether treatment is insulting to non social people who aren’t dangerous, just different, but who make others uncomfortable because they aren’t social.
Cymek said:
Arts said:
AwesomeO said:
I don’t know but that last paragraph is a bit disturbing. Not withstanding the existence of free will I can imagine a future of discounted sentences for those that can demonstrate altered brain function as we do now, or in a brave new world almost mirroring sci fi precog, those with a predisposition submitting to electonic surveillance.
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
I suppose it also comes down to is treatment insulting to non social people who aren’t dangerous just different but make others uncomfortable because they aren’t social.
step back a bit.
Cymek said:
Arts said:
AwesomeO said:
I don’t know but that last paragraph is a bit disturbing. Not withstanding the existence of free will I can imagine a future of discounted sentences for those that can demonstrate altered brain function as we do now, or in a brave new world almost mirroring sci fi precog, those with a predisposition submitting to electonic surveillance.
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
I suppose it also comes down to is treatment insulting to non social people who aren’t dangerous just different but make others uncomfortable because they aren’t social.
‘non social’ is a clinical term not implying that they don’t like to be around others… but more they don’t apply the same standards and morals as ‘most’ others
Arts said:
Cymek said:
Arts said:
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
I suppose it also comes down to is treatment insulting to non social people who aren’t dangerous just different but make others uncomfortable because they aren’t social.
‘non social’ is a clinical term not implying that they don’t like to be around others… but more they don’t apply the same standards and morals as ‘most’ others
OK
Arts said:
Cymek said:
Arts said:
to be clear, criminality is not a core feature of psychopathy (as we understand it), knowledge of brain function could be very helpful in treatment for those not into criminological pathways but afflicted by other non social aspects.
I suppose it also comes down to is treatment insulting to non social people who aren’t dangerous just different but make others uncomfortable because they aren’t social.
‘non social’ is a clinical term not implying that they don’t like to be around others… but more they don’t apply the same standards and morals as ‘most’ others
I think one of the most obvious features of a psychopath is being unable to empathise with other living organisms. However, they can be very emotional regarding themselves and any perceived hurt.
PermeateFree said:
Arts said:
Cymek said:
I suppose it also comes down to is treatment insulting to non social people who aren’t dangerous just different but make others uncomfortable because they aren’t social.
‘non social’ is a clinical term not implying that they don’t like to be around others… but more they don’t apply the same standards and morals as ‘most’ others
I think one of the most obvious features of a psychopath is being unable to emphasise with other living organisms. However, they can be very emotional regarding themselves and any perceived hurt.
Seriously. Psychopaths exist at many levels of our society, including being married with children.
roughbarked said:
PermeateFree said:
Arts said:
‘non social’ is a clinical term not implying that they don’t like to be around others… but more they don’t apply the same standards and morals as ‘most’ others
I think one of the most obvious features of a psychopath is being unable to emphasise with other living organisms. However, they can be very emotional regarding themselves and any perceived hurt.
Seriously. Psycopaths exist in many levels of our society including being married with children.
Average one in a hundred they say. Fortunately few are killers.
PermeateFree said:
roughbarked said:
PermeateFree said:
I think one of the most obvious features of a psychopath is being unable to emphasise with other living organisms. However, they can be very emotional regarding themselves and any perceived hurt.
Seriously. Psycopaths exist in many levels of our society including being married with children.
Average one in a hundred they say. Fortunately few are killers.
> but more they don’t apply the same standards and morals as ‘most’ others
persons considered unnatural, that by their nature resist being more normal, that cause other people problems/trouble.
psychopathy’s probably a better word, not that I use it.
mate, what’s ya psychopathy
roughbarked said:
PermeateFree said:
roughbarked said:
Seriously. Psycopaths exist in many levels of our society including being married with children.
Average one in a hundred they say. Fortunately few are killers.
It isn’t who they physically kill. It is more about the lives they ruin to bolster theirs.
They can be highly social and flatter and charm their way into the lives of many, but beneath that pleasant exterior, they go to that trouble for the potential rewards. Simply you are their prey.
PermeateFree said:
roughbarked said:
PermeateFree said:
I think one of the most obvious features of a psychopath is being unable to emphasise with other living organisms. However, they can be very emotional regarding themselves and any perceived hurt.
Seriously. Psycopaths exist in many levels of our society including being married with children.
Average one in a hundred they say.
no one says that
Arts said:
PermeateFree said:
roughbarked said:
Seriously. Psycopaths exist in many levels of our society including being married with children.
Average one in a hundred they say.
no one says that
I didn’t.
Arts said:
PermeateFree said:
roughbarked said:
Seriously. Psycopaths exist in many levels of our society including being married with children.
Average one in a hundred they say.
no one says that
I think they do. Have heard it many times in serious discussions on the subject that also included medical professionals.
PermeateFree said:
Arts said:
PermeateFree said:
Average one in a hundred they say.
no one says that
I think they do. Have heard it many times in serious discussions on the subject that also included medical professionals.
the literature pokes around the 0.5% to 2.5% marks
Arts said:
PermeateFree said:
Arts said:
no one says that
I think they do. Have heard it many times in serious discussions on the subject that also included medical professionals.
the literature pokes around the .5% to 2.5% marks
If my maths are right that would make one in two hundred, and five in two hundred. Well within 1 in 100.
PermeateFree said:
Arts said:
PermeateFree said:
I think they do. Have heard it many times in serious discussions on the subject that also included medical professionals.
the literature pokes around the .5% to 2.5% marks
If my maths are right that would make one in two hundred, and five in two hundred. Well within i in 100.
what I’m saying is that real research rarely gives definitives like 1 in 100; there is always a range because of potential flaws in methodology and recognition of such
Arts said:
PermeateFree said:
Arts said:
the literature pokes around the .5% to 2.5% marks
If my maths are right that would make one in two hundred, and five in two hundred. Well within i in 100.
what I’m saying is that real research rarely gives definitives like 1 in 100.. there is always a range because of potential flaws in methodology and recognition of such
The real issue is that they are out there and need to be recognised as such.
Arts said:
PermeateFree said:
Arts said:
the literature pokes around the .5% to 2.5% marks
If my maths are right that would make one in two hundred, and five in two hundred. Well within i in 100.
what I’m saying is that real research rarely gives definitives like 1 in 100.. there is always a range because of potential flaws in methodology and recognition of such
Don’t know what you are arguing about! One in a hundred is well within your statistics from “serious literature”. Just saying one in a hundred in conversation is hardly being definitive, plus it is very conservative with no exaggeration at all.
roughbarked said:
Arts said:
PermeateFree said:
If my maths are right that would make one in two hundred, and five in two hundred. Well within i in 100.
what I’m saying is that real research rarely gives definitives like 1 in 100.. there is always a range because of potential flaws in methodology and recognition of such
The real issue is that they are out there and need to be recognised as such.
there’s a bit of discussion about whether it actually exists, or it should be named something else. Psychology is not an exact science and definitions change all the time.
PermeateFree said:
Arts said:
PermeateFree said:
If my maths are right that would make one in two hundred, and five in two hundred. Well within i in 100.
what I’m saying is that real research rarely gives definitives like 1 in 100.. there is always a range because of potential flaws in methodology and recognition of such
Don’t what you arguing about! One in a hundred is well within your statistics from “serious literature”. Just saying in conversation one in a hundred is hardly being definitive, plus it is very conservative with no exaggeration at all.
I’m not arguing, it’s a pleasant conversation
Arts said:
roughbarked said:
Arts said:
what I’m saying is that real research rarely gives definitives like 1 in 100.. there is always a range because of potential flaws in methodology and recognition of such
The real issue is that they are out there and need to be recognised as such.
there’s a bit of discussion about whether it actually exists, or it should be named something else. Psychology is not an exact science and definitions change all the time.
I can only go along with this as indeed change is the only real constant.
Arts said:
PermeateFree said:
Arts said:
what I’m saying is that real research rarely gives definitives like 1 in 100.. there is always a range because of potential flaws in methodology and recognition of such
Don’t what you arguing about! One in a hundred is well within your statistics from “serious literature”. Just saying in conversation one in a hundred is hardly being definitive, plus it is very conservative with no exaggeration at all.
I’m not arguing, it’s a pleasant conversation
Oh! Carry on then.