Star Army

Star ArmyⓇ is a landmark of forum roleplaying. Opened in 2002, Star Army is like an internet clubhouse for people who love roleplaying, art, and worldbuilding. Anyone 18 or older may join for free. New members are welcome! Use the "Register" button below.

Note: This is a play-by-post RPG site. If you're looking for the tabletop miniatures wargame "5150: Star Army" instead, see Two Hour Wargames.

  • If you were supposed to get an email from the forum but didn't (e.g. to verify your account for registration), email Wes at [email protected] or talk to me on Discord for help. Sometimes the server hits our limit of emails we can send per hour.
  • Get in our Discord chat! Discord.gg/stararmy
  • 📅 July 2024 is YE 46.5 in the RP.

10 Robots Per Soldier?

Wes

Founder & Admin
By 2033, the US military is predicted to have robots outnumber its human soldiers 10 to 1. We're already seeing robots take on tasks like ordnance disposal, scouting, and political assassination, and soon they're expected to carry loads on the ground (think of the BigDog robot and such). In South Korea, lethal drone turrets watch the DMZ.

Thoughts on this: Drones don't have ethics. They have no problem murdering on behalf of the plutocracy that can afford to pay for them. American drone strikes have already killed four American citizens with basically no consequences, including a 16-year-old American boy, in violation of the Constitution, which guaranteed him the right to due process. Once we have completely autonomous robots that don't even need a human behind the screen, the potential for abuse becomes even more extreme.

I'm reminded of this scene from Elysium. "I would like them dead," the rich business guy orders the robots.

I think within 20 years we're going to see driverless cop cars that roam the city scanning people's faces and license plates. We'll see immigrant-hunting drones with thermal imagers on the borders (already starting to happen I think). Maybe we'll see Army units with automated logistics trains and self-deploying "smart" land mines.

What implications does this have for a far-future universe like Star Army?

For one, I think this is why the Star Army uses Nekovalkyrja instead of robots: Nekovalkyrja will think about what they are doing instead of just blindly following orders. It keeps the Star Army less capable of evil.
 
I think collusion between economic entities and political entities is unavoidable, and it's naive to ignore that marriage, the subversion it enables, and the counter-cultures that will form as a result.

In any revolution there are three sequential steps and they correlate directly with the three pillars of power:

Step 1: The ideological revolution:
Undermining belief systems that support the control of any given entity. This is the systematic erosion of legitimacy -- exposing criminal elements and abuses within governments and inspiring discontent among the citizens the state depends on in order to function.
  • This is actually happening right now in the western world thanks to instant mass communications such as radio, television and internet and is primarily aimed at holdovers from colonialism and imperialism.
Step 2: Strategic non-compliance:
The goal here is to interrupt the chain of obedience wherever and as often as possible, to reveal the breaks and damage in the chain so people are aware it is happening -- and then to document the police and military brutality that follows and distribute that record throughout the public so they know what is happening.

This damages the ruling party's image since power is all image. When image breaks down, people replicate this behaviour and eventually it hits a point of critical mass.

  • Importantly, revolutions are almost entirely psychological: Building confidence is the backbone and thus small low-stakes events are far more successful than large high-stakes events.

Step 3: Exorcism of monopoly on violence
When the momentum hits critical mass, the monopoly on the exercise of violence is removed from the current regime. Usually it's relatively peaceful: the police and military (the enforcement arm of the state, without which the state has no power) are made to side with the people over the pre-existing regime. It's important for them to know the people WILL support them IF they break the chain of command.
  • Falling governments almost always resort to brutal repression and intense regulation of armaments, information, and technology to stay in power, but this backfires: even one refusal to follow an immoral order chain-reacts into refusals by others to follow future immoral orders, destroying the illusion of authority. When that happens, it's essentially game over.
  • Example: East Germany, November 1989: the communist government ordered the military to stop the mass protests that had been building all year. The army's commander refused and ordered his men to stand down. This sent a clear message that the communist party wasn't in control, and thus communism fell.
The problem here is that automated systems prevent Step 3 from occurring peacefully, since a weapon cannot question its orders: it merely acts on them. By removing soldiers from weapons, there is no way for the monopoly on violence to be taken away by people who refuse to act on immoral orders.

As a direct result, revolt becomes very messy by design: the monopoly on violence can only be overcome with a direct response of violence, a fight most western nations are better prepared for than their people, since they spend the people's money to establish the armaments of their regime.

Today, our safety net as a people in the western world is built on the belief that the technical and intellectual capabilities of citizens will allow us to exercise control over these tools and subvert these seemingly controllable elements.

I would argue it's a false comfort, and that automated weapons are among the greatest threats to democracy and human freedom humanity has ever known, since by design they can only accept instructions from the bodies that own them. Keeping humans in war partially democratises it, because there comes a point at which people will simply say no, even under ridiculous and extreme circumstances (as within WWII Germany, where internal revolts began snowballing into a huge distraction for the Axis, forcing them to fight an internal war as well as an external one and hugely marring their strategic potential -- one reason they poured so much money into weapons innovation, trying to compensate by simply having the sharper sword).

To this end, the Nekovalkyrja is an attempt to overcome this problem. The question becomes: do Nekovalkyrja ever say no to immoral orders, or does only the illusion exist that they have the capacity to do so?

Think about that before you go to bed.






Sources:
  1. "The Psychology of Revolution" by Gustave Le Bon
  2. "From Dictatorship to Democracy" by Gene Sharp
 
In the Nesha Kingdom, all unmanned drones are strictly monitored by people to ensure no abuse happens, and they are mainly designed to keep the peace. Law enforcement is handled almost entirely by people, but drones patrol outside the cities and in the farmlands to protect the local food supplies. Drones also provide protection in areas where a personnel force currently can't operate, such as some of the underground tunnels built early in the colonization: some of those tunnels are now toxic and impassable to people, but drones can patrol them easily.

There are other uses for the drones within the Kingdom.
 
Here's an insightful comment from user fuzzyfuzzyfungus on Slashdot:
How sophisticated does a guidance system have to be before it qualifies as a (rather suicidal) robotic soldier?

While there seems to be a bit of a taboo about handing a robot a gun and telling it 'yeah, just frag anything that looks particularly infrared in that direction', heat-seeking missiles, with no human terminal guidance, have been available for years.

We don't have anything that makes broader strategic decisions; but if you count robots attached to their munitions, we've been letting robots make kill decisions, within a confined search space, autonomously for some time. They just don't get to come back afterward.
It's a good point. The Star Army uses homing torpedoes and missiles extensively. They aren't sapient, though.
 
Drones are typically used to administer force rather than diplomacy, since diplomacy relies on a human element. That means anywhere diplomatic decision-making may be required, humans will be required until diplomatic functions can be synthesised accurately and digitally, not only on the basis of legality but also of ethics.

The next step is to consider the scenarios automated weapons are likely to encounter, and which responses they are actually ideal for. For example, while robots are an excellent choice in lethal-force scenarios, they lack the intelligence to defuse a bomb without operator assistance or remote instruction.

To this end, drones are usually best used to exercise the force spectrum, which consists of:
  1. Investigation: Appearing on scene and providing information to the dispatcher and strategic information command, who then issue further instructions, along with networked information from others present. This is also the first step of diplomatic function.
  2. Loiter: Passive presence. Provides ongoing area-wide information and deterrent response potential. Simply being observed is often enough to prevent many actions.
  3. Intimidate: Active presence. Direct psychological engagement with potential threats through verbal commands, often including threats of the consequences of non-compliance.
  4. Non-lethal force: A response that disables a perceived threat or agent, usually for legal reasons or to collect a given agent for further assessment. Often used as a demonstration of intimidation to break the confidence and organisation of those present.
  5. Lethal force: A response that removes a perceived threat or agent entirely. Again, often as a threat or demonstration, but primarily for the neutralisation of an opposing force.
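The tiered spectrum above behaves like an escalation ladder, and one safeguard the thread keeps circling back to is requiring a human sign-off before any actual use of force. A minimal sketch in Python (all names here are hypothetical, not from any real drone system):

```python
from enum import IntEnum

class ForceLevel(IntEnum):
    """The five tiers of the force spectrum, ordered by severity."""
    INVESTIGATE = 1   # appear on scene, relay information to command
    LOITER = 2        # passive presence; deterrence through observation
    INTIMIDATE = 3    # active presence; verbal commands and warnings
    NONLETHAL = 4     # disable a threat for legal or assessment reasons
    LETHAL = 5        # remove a threat entirely

def escalate(current: ForceLevel, operator_approved: bool) -> ForceLevel:
    """Move one tier up the spectrum, but require explicit operator
    approval before any use of force (tiers 4 and 5), keeping a human
    in the loop for exactly the decisions this thread worries about."""
    proposed = ForceLevel(min(current + 1, ForceLevel.LETHAL))
    if proposed >= ForceLevel.NONLETHAL and not operator_approved:
        return current  # refuse to escalate without a human sign-off
    return proposed

# A drone may escalate from loitering to intimidation on its own...
assert escalate(ForceLevel.LOITER, operator_approved=False) == ForceLevel.INTIMIDATE
# ...but it cannot apply force without operator approval.
assert escalate(ForceLevel.INTIMIDATE, operator_approved=False) == ForceLevel.INTIMIDATE
assert escalate(ForceLevel.INTIMIDATE, operator_approved=True) == ForceLevel.NONLETHAL
```

The design choice here mirrors the post's argument: the machine can freely handle the observational tiers, but the transition into force is gated on a human who retains the ability to say no.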
An often unforeseen issue is the problem of combining drone decision-making with human agents in a given scenario: people who are present can provide information and suggestions to drones to guide and shape their decisions. Since a person's awareness and foresight tend to be greater than a machine's, that input should be seriously considered.

To this end, if diplomacy is necessary, the best use of automated weapons is not fully independent operation but using drones as tools and force multipliers for those present in the scenario (either via telecommunications, with the drones doubling as their senses, or via direct presence for heightened awareness, reduced latency, and thus improved decision-making).

On paper, you would want to design drones specifically for these tasks: investigation and loitering, for example, could be handled by a tennis-ball-sized observer drone. Intimidation, non-lethal force, and lethal force could be provided by a secondary system deployed to the theatre, such as an aircraft or perhaps a ground unit.

In theory, wouldn't it be really smart to let an operator issue instructions to equipment without the operator's explicit attention or presence?

Picture it: a Nekovalkyrja steps out of her power armour in a show of faith to exercise diplomacy unbridled and far less intimidating, while the power armour continues to provide an intimidating presence and gather information as an extension of her body, knowing everything it knows, with her as the brain controlling it.

It's the best of both worlds.

And when the shit hits the fan? She climbs back inside and resumes combat operation.

She could even carry the rifle with her while the armour uses its integrated systems, so they can attack a target from two separate positions -- or use the armour as a decoy while she escapes.

On paper, weapons-pods also need to be made more... Diverse. Capable of limited independent movement and automated responses instead of just being a glorified missile with a gun on it. Their strategic potential is tremendous.

On paper, an even better response would be creating gear for a power armour that extends its range, detection, and armament potential to match that of a light frame. The user could then disembark and decouple, and the frame would continue to provide automated action on the battlefield: as a mobile AWACS/jamming system, heavy weapons platform, advance warning system, and even a pack-horse for lifting heavy mission-specific equipment. On paper, it could move as an aircraft or even as a light tank about the size of a motorcycle, not unlike the spider-tanks from Ghost in the Shell.
 
I think one thing worth considering here is the distinction between drones and robots. Osaka touches on this, but I think it's important to remember that we have not contemporaneously seen any truly autonomous weapon systems. The guidance systems Wes brings up aside, I really don't see any meaningful difference between a contemporary drone and normal attack aircraft. I prefer the term Remotely Piloted Vehicle (RPV) over drone or UAV, because I think it really gets to the ethical point: humans have always found it easier to kill when they are removed by some degree from the situation. Whether that is literal physical distance, as with an RPV or the Navy firing Tomahawks, or psychological by dehumanizing the enemy. With platforms like the Reaper and Predator, there are several live human beings in control of it at all times and who physically pull the trigger. A true, autonomous robotic platform brings in a more deontological question. If we're designing the rules that these systems operate by, the ethical analysis returns to that initial design.

For SARP, I think the interesting elements are less in terms of the capability of RPV or robotic platforms. It's clear that advanced factions like Yamatai are capable of building robot and AI systems all the way from dumb, rote machines to fully sapient, sentient beings (Nekovalkyrja and MEGAMI, for instance). The more intriguing question comes down to what degree of ethical independence you choose to give those systems in the first place. Do you let them call the shots, let them decide whether to follow an order or not? Or do you make them obedient, instantly obeying any directive from an authorized source?
 
Yamatai clearly does not have a mental monopoly on its Nekovalkyrja soldiers. They've made plenty of decisions that might or might not be popular, and some have either broken off or outright betrayed the Empire.
 
Breaking news: Amazon has parcel delivery drones it wants to use to deliver your small orders. Whoa! I didn't expect them so soon. If the FAA gives approval, we could be seeing this as a common courier method within a decade.

Also, a company is working on a "night watchman" robot:
Mr. Li envisions a world of K5 security bots patrolling schools and communities, in what would amount to a 21st-century version of a neighborhood watch. The all-seeing mobile robots will eventually be wirelessly connected to a centralized data server, where they will have access to “big data,” making it possible to recognize faces, license plates and other suspicious anomalies.
 
Here's another awesome robot system I discovered via Slashdot: Amazon is using around 1,400 Kiva Systems robots to move and sort their warehouse inventory. The workers don't need to walk around, the actual shelves come to them. Wow! Star Army logistics should be using this!

 
Those Amazon drones are going to have a very limited range. You have some of those flying around here, I guarantee someone is going to be skeet shooting the drones to get the goodies.
 
It's got a GPS and cameras. There's a good chance that anyone trying to do this would be caught.
 
Also, the goodies are suspended under the drone itself, therefore said shooter would be shooting the goods before the drone in most cases. But you are otherwise correct: these would have to have an extremely limited range, but with Amazon putting distribution points in just about every state, and multiples in some of them, it is getting to be more and more realistic.
 
Has Amazon stated this to be cost-effective? It's very interesting and ideal in many cases, but I'd prefer to have a company developing drone cars and drone delivery systems for said cars. Bulk unmanned delivery is going to be the next big thing.

Oh, here's a thought, Wes: why not have the roads and paths on Yamatai power the cars on them and the devices of the pedestrians walking or driving over them, then bill them for their usage OR pay for it out of taxes? You could use the same system to reinforce a wireless communications network (which links VERY quickly with a mainline high-speed wired infrastructure). The road could even be modular, so when a part is damaged, you close that specific block, route traffic around it, fly the damaged block out, fit a new one where the old one was, and traffic is back up within the hour.
 