46 Comments

    1. fidelkastro on

      Under no circumstances should they be allowed to make that determination

    2. iDontRememberCorn on

      Yes, let’s let the sociopaths set the standard on what the machines can do.

    3. HAHA_goats on

      Unless they can eliminate false positives, the answer has to be no. A killed person can’t be dug out of the spam folder.

      To put a stop to tech bros trying to implement dumb shit like this, we need a body of law that doesn’t pretend that nobody is accountable as long as software does things. The implementers, the vendors, and the users all have to carry accountability. And it needs to be completely untouchable by any EULA.

      >In the past, Silicon Valley has erred on the side of caution. Take it from Luckey’s co-founder, Trae Stephens. “I think the technologies that we’re building are making it possible for humans to make the right decisions about these things,” he told Kara Swisher last year. “So that there is an accountable, responsible party in the loop for all decisions that could involve lethality, obviously.”

      That suggests dumping 100% of the accountability onto users. That isn’t compatible with hiding source code and algorithms as “trade secrets”. It prevents any user from ever being able to fully weigh the software’s contribution in the decision-making process. If they want trade secrets and profits, accountability has to go along with that in all cases.

    4. Admirable_Nothing on

      Then what happens when they decide they want to kill the good guys rather than the bad guys? That is a thorny problem that needs to be solved. Read some Asimov; he solved it nicely with the Three Laws of Robotics.

    5. Oh, so the deep thinkers in their ivory towers are debating this. Listen, asshats, the horse is already out of the barn because your tech bro overlords care about profit, profit, profit. Not lives, not morality, not the claptrap that you spew out to each other as you engage in intellectual masturbatory exercises.

      You wanna do something? Maybe stand up with some backbone to your managers and on up the chain and start making more noise and putting your money where your mouth is.

    6. n3w4cc01_1nt on

      currently…

      flight assist and evasive maneuvers? yeah

      kill? no

      the tech isn’t refined enough and they’d have to run endless tests to make sure it’s accurate.

      that guy has some interesting arguments and his vr kill mask art piece was pretty damn good conceptual art.

      his step bro is sus af though

    7. The simple fact that they are discussing this “idea” just shows how fucked we are…

    8. Gee whiz Beav, isn’t this the same sub that had an article about autonomous attack drones already doing “work” in Ukraine? A little company called Anduril out of SoCal?

      Looks like the debaters are pointlessly arguing for fun and for their own substantial paychecks, as that type is wont to do. What a fucking timeline.

    9. And they’re going to ask this weirdo? Of course, he’s related to Matt Gaetz.

    10. auburnradish on

      That decision will be made by the military, not Silicon Valley, because the military and its contractors will develop their own AI systems.

    11. Longjumping_Sock1797 on

      So is this how humanity is going to end? So many possibilities for us ending ourselves.

    12. ArnieCunninghaam on

      I’m not listening to philosophical debates from anyone with a soul patch.

    13. JaketheSnake319 on

      Did we not learn anything from all the Terminator and Matrix movies?

    14. CrzyWrldOfArthurRead on

      They already are. There are missiles and drones that use AI to determine where their target is even when GPS is denied. They look at heat signatures and other sensor data to get their bearings and head for the (hopefully) correct target. They make a decision when presented with multiple possibilities or if the target has moved.

      It’s not new technology at all.

    15. SkeetySpeedy on

      “Silicon Valley” is not a person or people

      Who is debating? Who specifically, by name, is talking about this in real conversations with other real people?

    16. happy_snowy_owl on

      This question isn’t actually what it appears.

      If a commander launches a guided missile, he does so knowing that there’s some non-zero chance that the guidance system fails and it will not hit its target. There’s also a non-zero chance that the targeting information is faulty. These may result in unintended collateral damage or casualties.

      This is no different than launching a weapons system governed by AI. The commander accepts the risk in terms of probability of success prior to launch, and determines whether it meets his minimum threshold. AI has the potential to reduce inaccuracies introduced by current kill chains.

      In either case, there’s still an accountable human at the end of the kill chain, which alleviates most people’s moral and ethical qualms about the whole thing.

    17. One_Okra_2487 on

      Ahh yes, the military-industrial complex is military-industrial complexing. It’s only a matter of time before FAANG produces warfare hardware (they already do software).

    18. SeriousMonkey2019 on

      Short answer: no
      Long answer: The decision shouldn’t be up to Silicon Valley

    19. The fact that this is even being debated just shows how fucking cracked these tech billionaires are. The answer to this question, for the survival of humanity, should always and unequivocally be… FUCKING HELL NO!!!

      IDIOTS!!

      Why is there not a GLOBAL UNIVERSAL ARTIFICIAL INTELLIGENCE CODE OF ETHICS?

      These mother fuckers have lost their GAWD DAMNED MINDS!!

    20. CrustyBappen on

      Silicon Valley will say no, but does that stop enemies of the West from doing it?

      Unfortunately, the cat is out of the bag. Silicon Valley doesn’t make AI murder drones; the military-industrial complex does that.

    21. Isn’t that one of the plot lines in the classic sci-fi novel Don’t Create the Torment Nexus?

    22. DillyDoobie on

      What Silicon Valley thinks doesn’t fucking matter in the slightest. All it takes is one random person to give AI that capability, and then everyone will be doing it. Seems like it’s only a matter of time before it’s mainstream.

    23. vomitHatSteve on

      “And my point to them is, where’s the moral high ground in a landmine that can’t tell the difference between a school bus full of kids and a Russian tank?”

      I’m sorry?! That’s your justification in paragraph 2!?

      So the answer is: no, of course. If you build autonomous kill bots that kill humans, you are a monster. You should be executed and go straight to hell.

      Now, autonomous bots that kill other robots? That’s fine. War is mostly about economy vs. economy anyway.

    24. It won’t be up to the techies. It will be up to the military.

      And once the military in one country concludes that autonomous weapons are the way to go because they can do more damage to the enemy faster — perhaps before the enemy has time to mount a response or even a defense — there will be little choice but for others to follow suit.

      Nuclear all over again. Powerful nation-states will have them, while agreeing not to use them, and we all hope that agreement doesn’t break down.

    25. Lootboxboy on

      I hope when they build the Torment Nexus from the hit sci-fi novel *Don’t Build The Torment Nexus* they remember to paint flames on it. Flames make it go faster.

    26. GroundbreakingGur930 on

      SKYNET is the only one brave enough to ask the REAL questions.

    27. I was hoping to die at a fun 100 years old before the Terminator came true. God damn.

    28. Minister_for_Magic on

      It’s fucking pathetic that media can’t be bothered to directly call this out for what it is: Peter Thiel and his cabal of disciples all doing dystopian shit while besmirching the good names from Tolkien’s works.

      Once upon a time, VCs had a “no guns, no drugs” policy for investments. Now, Thiel has managed to repackage financing weapons manufacturers as “rebuilding American industry” and many big funds are embracing it.
