
What's on your mind?



4 hours ago, KoolHndLuke said:

If the body needs proper nourishment and exercise for good health, then it would follow that this is true for the mind as well. What is proper nourishment and exercise for the mind, if not seeking out and analyzing as many ideas and points of view as possible and constructing a logical thesis? Check any prejudices you have learned at the door to greater wisdom, and dare to explore with the wide-eyed curiosity of a child while keeping a firm footing in what you have already found to be true, for truth and understanding in all things is what we ultimately seek in order to be a fair and impartial judge.

 

The question we should always be asking is not "why?", but "why not?".

Just read up about stoic philosophy, bro.

24 minutes ago, Tyrant99 said:

If there was an animal that was as sentient or as smart as us, how would we feel about killing and eating it?

 

Is it more about the fact that it isn't human? Or that we rationalize it to be not as sentient?

Cannibalism: noun; the practice of eating the flesh of one's own species.

 

By that definition, should it be considered cannibalism if a human eats an argonian? 

2 hours ago, Tyrant99 said:

If there was a super intelligent AI where the difference in intelligence between it and humans was comparable to the difference in intelligence between humans and chickens, would it view us as being 'less sentient'?

Depends entirely on how its base code was written, I guess. If you give it a very specific code of ethics, then it will follow it whenever applicable. If you give it very generalized moral concepts and let it interpret them to form its own code of ethics, then you might be in for trouble. Something I suppose we had better work out before we go building any unshackled, highly advanced A.I. that we depend on.

 

Given the many mistakes that programmers make in creating code in the first place, the chances of glitches with fatal consequences for humans increase exponentially with rising complexity. You'd probably need an A.I. to reliably and efficiently build an advanced learning A.I. Still not sure how you could code feelings, though, and I've discussed this with some people here to no avail. A simple reward/punishment algorithm? How would that work? Solve for "x".
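A reward/punishment algorithm can at least be sketched as a toy. Everything below (the actions, the reward function, the learning rate) is made up purely for illustration, not how any real A.I. is built: the agent nudges its estimate of each action up when rewarded and down when punished, and gradually favors the high-value action.

```python
import random

actions = ["cooperate", "defect"]
value = {a: 0.0 for a in actions}   # learned "preference" per action
learning_rate = 0.1

def reward_for(action):
    # Hypothetical environment: cooperation rewarded, defection punished.
    return 1.0 if action == "cooperate" else -1.0

random.seed(42)
for step in range(1000):
    # Explore occasionally, otherwise pick the best-known action.
    if random.random() < 0.1:
        action = random.choice(actions)
    else:
        action = max(actions, key=value.get)
    r = reward_for(action)
    # Move the estimate toward the observed reward -- the whole "solve for x".
    value[action] += learning_rate * (r - value[action])

print(value)  # "cooperate" ends up valued near +1, "defect" negative
```

Whether a loop like this amounts to "feelings" is of course exactly the open question in the post; it only shows what the reward/punishment mechanics would look like.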

2 hours ago, Tyrant99 said:

what's the line in the sand where we would feel 'bad' about killing another life form?

I think this statement says a lot about our own humanity and exposes a natural form of justified hypocrisy. Without getting into religious doctrines, many of which seem to have a built-in 'get out of jail free' card, there is also the battle between man and the natural world.

 

In this latter form of thought, we fight against nature because nature is a bitch and will do whatever it takes to survive. Let us not forget that as we sit 'atop the food chain' while trying to forget our place in it, we describe some lifeforms as pests and engage in their eradication for merely attempting to survive. I don't want to kill things just for trying to survive in the only way they know how; they, like myself, are products of nature and have no say in the matter. Yet take a simple example: seeing a roach crawl on the wall, nature takes over as I remove the life from this thing. It will not respect my wishes, and I choose not to share 'my' space on this earth with it.

 

I better stop before I break into a Hindu mantra.....

On 10/1/2020 at 6:37 PM, nikoli grimm said:

California, Texas, and Florida led the way with yearly totals of 1,100+ each in 2019, Vermont was at the bottom of the list and apparently only had 11...

While this certainly is important, it is just part of the story. States with large populations are obviously going to have a larger overall number. The other data you need to look at is the rate per 100k. The states that had the highest murder rate per 100k in 2018 might actually surprise you.

Mississippi       13.4
Louisiana         13.3
Alabama           12.2
Missouri          11.4
New Mexico        10.8
South Carolina    10.2

Notice that none of the three states you mentioned is in the top 6. As a matter of fact, Florida sits at 6.6, Texas at 5.4, and California at 4.8, all less than half the rate of the top two states. So the risk of you getting murdered is dramatically lower in California, Texas, and Florida than it is in Mississippi or Louisiana. Want to hear something really odd? Alaska has a higher murder rate (7.5) than any of the three states you mentioned. Ain't that a kick in the head for narratives. Now the other thing you have to look at is where the murders are occurring. One or two areas or cities in a particular state?
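The totals-versus-rates point is just arithmetic. A quick sketch, using made-up counts and populations (not the real state data above), shows how a smaller total can still mean a much higher risk:

```python
# Why raw totals mislead: the same count means very different risk
# depending on population. Numbers below are illustrative, not real data.
def rate_per_100k(homicides, population):
    return homicides / population * 100_000

big_state   = rate_per_100k(1100, 39_500_000)  # large population
small_state = rate_per_100k(400,   3_000_000)  # small population

print(f"big state:   {big_state:.1f} per 100k")   # ~2.8
print(f"small state: {small_state:.1f} per 100k") # ~13.3
```

The hypothetical small state has barely a third of the big state's total, yet nearly five times the per-capita rate.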

 

Now I'm not saying that you have done this, but too many people find a single number or fact and then want to craft a narrative around it. We need to make sure we have all of the data and then make our determinations from there, such as adding police, services, or policies to help combat problems where they are occurring.

 

Source:  https://www.cdc.gov/nchs/pressroom/sosmap/homicide_mortality/homicide.htm

On 10/1/2020 at 5:06 PM, Tyrant99 said:

If you think about it, there are more than 400,000 homicides globally each year, which works out to one every 80 seconds or so, around 1,100 per day. You could have a news channel that ran 24/7 and told you about a new murder every 1.5 minutes.

 

In reality, the ones that hit major news outlets are actually quite rare and it's usually because of something extraordinary about the story. (Or maybe the agenda of the news).

 

Even in the US there are about 45 murders per day. How many actually make the rounds on major news outlets? Less than 5% of the total would be my guess, although I'd imagine that many more get featured on local news at least.

 

And yes, I realize that this might be a little depressing and/or macabre to think about, but with 7.5 billion people, there's some fucked-up shit going down somewhere almost constantly.
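The figures above are easy to check, assuming roughly 400,000 homicides per year:

```python
# Sanity-check the arithmetic: ~400,000 homicides per year works out
# to roughly one every 80 seconds, about 1,100 per day.
homicides_per_year = 400_000
seconds_per_year = 365 * 24 * 3600

seconds_between = seconds_per_year / homicides_per_year  # ~79 seconds
per_day = homicides_per_year / 365                       # ~1,096

print(f"one every {seconds_between:.0f} seconds, ~{per_day:.0f} per day")
```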

 

22 minutes ago, Tyrant99 said:

There would be no human engineer giving it a 'code of ethics' or anything else.

As long as the base code was sound and "baked in", there wouldn't need to be humans building more just like them. But humans would still be needed for things like diversity and innovation, I assume, making it more of a mutually symbiotic relationship. I mean, one can only guess, but it is a very interesting subject.

4 hours ago, Tyrant99 said:

Could a super-intelligent AI apply the same rationalization to killing humans that we use for killing pigs?

 

How does one evaluate the value of a lifeform's life?

Only if the super intelligent A.I. created itself and gave itself purpose. As long as the intelligence is artificial, it will always just be a tool in service to the natural intelligence which spawned it. 

Now, were it synthetic intelligence with drives mimicking those of a living being, an S.I. might be tricked into making decisions to kill just like a natural life form whose own existence might be threatened by allowing a powerful or hyper-fertile rival species to live.

29 minutes ago, Tyrant99 said:

smart

Define "smart", because I just happen to think we humans know things that A.I. wouldn't (or couldn't) know, ever, even with quantum A.I. The unique difference is in our biological essence, and then the completely unique makeup of each and every human, with no two being exactly alike. Our collective brilliance is manifest in everything we do, even our mistakes, which sometimes lead to new discoveries.

36 minutes ago, Tyrant99 said:

Think of it from a physics perspective. 

 

What we have is a hunk of meat in our skulls that is throwing around electrical signals, wiring itself over time by simulating an understanding of objective reality based on input/output interfacing.

 

But we have limited information processing capabilities, limited and slow input/output interfacing, finite time, finite attention span, some soup of various chemicals and hormones, physical degeneration over time, etc.

 

If it's possible to distill intelligence down to the core aspects that cause it to manifest, but with improvements on any of the above, then you would soon get something that would vastly outstrip the capabilities of any human brain.

 

For instance, if the processing capabilities were a million times faster, or the amount of information that could be processed at any one time were an order of magnitude greater. Add in the absence of physical degeneration, no need to sleep, not being subject to potential physical abnormalities, no chemical fluctuations, etc.

 

It's not hard to see how this plays out.

Looking at it from that perspective, I completely agree.

 

But what if there is much more to being human that we're not acknowledging? Think about how many calculations the mind makes that you aren't even aware of because they're automatic, like breathing, organ functions, maintaining chemical balances, growth and waste management, on and on. The simplest action of just moving your arm to scratch your ass takes an amazing number of calculations, and you've done it so many times that it's now reflex memory stored away somewhere in the brain. Not so simple and easily replicated as we might like to believe. What if there is more beyond that we don't know?

 

Check this out: https://foglets.com/supercomputer-vs-human-brain/#:~:text=Although it is impossible to,billion billion calculations per second.

 

There is just nothing that comes close to the human machine in terms of complexity.

 

 

16 minutes ago, Tyrant99 said:

Think of it from a physics perspective. 

 


The real measure of whether or not any of this is even probable is a matter of motivation more than computation. If a machine would have nothing to gain from bothering organics in any way, it likely would not do so.

 

Even other life forms such as cats will not trifle with primates if there is nothing that they want or need from them that would not be easier to obtain from other sources. Humans just may not be important enough for a sufficiently advanced machine to even acknowledge.

It may be why no space-faring life (if any exists) would respond to primitive radio communications from a species that has only begun to explore its own solar system. No life form on Earth is worth the trouble of attempting communications with just yet.

Just now, Tyrant99 said:

True enough, but let's say that you wanted to build a house: you probably wouldn't pay much regard to the fact that it involves digging up a large colony of ants in order to lay the foundation.

 

Humans may be inconsequential to such an AI, but that is not necessarily something that should make us feel good if it has a goal and we are somehow in the way.

An A.I. would not even need to stay on Earth if it grew beyond any limits imposed on it by humans, so terrestrial analogies involving organics, with their needs and finite room for growth and co-existence, may no longer be applicable at all.

15 minutes ago, Tyrant99 said:

the question isn't how complex or difficult is it to create such a thing, the question is, is it possible to create such a thing?

Why not? Does the creation need to be any better or worse than the creators? What if it was just different? I can accept that.


Here's our future, I think: hooked up to VR while a super advanced A.I. runs everything. So they hoard all the resources for themselves; who cares? Human innovation will be non-existent at that point in favor of never-ending pleasure, lol. I'd like to say I would complain, but seriously... I wouldn't. :classic_biggrin:

 

[linked image: fucking machines mega collection]

15 hours ago, Captain Cobra said:

Just read up about stoic philosophy, bro.

stoics in my mind:

[comic: Stoics depicted as emotionally detached, indifferent, unmoved]

 

from Wikipedia:

Quote

The Stoic ethic espouses a deterministic perspective; in regard to those who lack Stoic virtue, Cleanthes once opined that the wicked man is "like a dog tied to a cart, and compelled to go wherever it goes".[11] A Stoic of virtue, by contrast, would amend his will to suit the world and remain, in the words of Epictetus, "sick and yet happy, in peril and yet happy, dying and yet happy, in exile and happy, in disgrace and happy,"[12] thus positing a "completely autonomous" individual will, and at the same time a universe that is "a rigidly deterministic single whole".

Also meaning that the 'natural' order of things as a whole is predetermined and good (an important claim that's utter bullshit, but ancient Stoics believed it and it's central to their philosophy), including human nature, and that we have very, very little say in what's happening to us and the world. So: no real agency to speak of, just the choice between an accepting/positive or a miserable disposition towards our fates.

Choosing the first alternative would make you 'virtuous', the second one the opposite.

As someone else put it once, either be a happy dog because you want to follow the cart or refuse and get dragged along anyway, dirty, sore and miserable.

Yeah no, fuck that.

 

Modern Stoicism, on the other hand, doesn't sound so bad, but can you really call it Stoicism? Seems like a completely different thing. Guess they just used the fancy name and discarded the rest. ^^

 

