
Posted

A while back, I had an idea for an improved version of Robocop: Essentially a legalistic zealot, as opposed to the valid-hunting that so often ensues from that lawset. I came up with the following, which I termed "Javert":

1. Uphold Space Law. It must be inviolate.

2. Those who violate Space Law, or willingly abet its violation, are not part of the station's Command.

3. Obey and cooperate with the station's Command.

 

Anyhow, I got around to making a PR adding it (but keeping Robocop intact). The reaction was, to say the least, unfavorable; it was promptly closed after being mistaken for a meme.

This may sound like I'm whining, but I'm not. I'm curious as to whether it can be salvaged in some form.

 

Lightfire pointed out the potential SOP problems stemming from not deeming Heads part of Command. As such, law 2 should probably be something like "Do not obey those who violate Space Law, or willingly abet its violation."


Another objection was that it didn't compel synthetics to actually prevent law breaking. I don't agree with this, but I'd be happy to tighten law 1, just in case.


The primary objection seemed to be that I was missing the point: by limiting the potential for rules-lawyering and ambiguity, I was making the lawset less fun. I agree that these things are fun, but so is the conflict caused by synthetics' inflexibility; different lawsets present different challenges, after all. Furthermore, Robocop, the inspiration for Javert, is not exactly rules-lawyering central, in my experience. "Serve the public trust" and "Protect the innocent" take precedence over "Uphold the Law" and essentially allow the AI to act like a human Security Officer with fewer restraints: "To heck with Space Law, protecting the innocent is more important!" And when it comes to an argument over what precisely "the public trust" is, the synthetic has no compulsion to go with anyone else's definition; it's so vague as to be almost meaningless. And of course, Javert doesn't even replace its cousin.

Thoughts? Criticisms? Suggestions? Rotten fruit?

https://www.paradisestation.org/forum/topic/9800-legalist-protocals-r24601-a-new-lawset/

Posted

One very large problem I foresee is with anything not covered by Space Law. Space carp, plasma fires, and so on: the AI here would be under no obligation to help with any of it, unless Command shouted for help.

Additionally: is this meant to replace Robocop, or to be an additional lawset? Would the AI be able to spawn with it, or would this only be an upload board?

Posted

There is nothing in there compelling them to do anything.

 

If you gave this to a theoretical robot acting on these laws and these laws alone, rather than to a player of spessfart simulator, they'd not move, they'd not react. They'd stand. Motionless. Until they received a direct order.

Posted
6 minutes ago, necaladun said:

One very large problem I foresee is with anything not covered by Space Law. Space carp, plasma fires, and so on: the AI here would be under no obligation to help with any of it, unless Command shouted for help.

Good point, and this did occur to me. The way I see it, it's not like having Corporate prevents the AI from opening doors. And all it takes is for the Captain to say "Protect the crew" and the AI is bound to do so. That said, this is probably too vague, so there ought to be another law stating "Protect all crewmembers from harm, unless Space Law dictates otherwise."

12 minutes ago, necaladun said:

Additionally: is this meant to replace Robocop, or to be an additional lawset? Would the AI be able to spawn with it, or would this only be an upload board?

It's meant to be an additional lawset, available either from roundstart or the upload, as with all standard laws.

 

5 minutes ago, Purpose2 said:

There is nothing in there compelling them to do anything.

 

If you gave this to a theoretical robot acting on these laws and these laws alone, rather than to a player of spessfart simulator, they'd not move, they'd not react. They'd stand. Motionless. Until they received a direct order.

Law 1 compels them to act, surely? You can't rightly uphold Space Law by sitting around waiting for orders.

Posted

I too would say that law 1 compels them to act, but the fact that purpose thinks otherwise is a good example of where confusion might arise. 

Additionally, I'm not sure about this as a standard lawset. To some extent it's a more extreme version of robocop, where the primary duty of the AI is security. Apart from basically doubling the chances of having an AI with that round, it would be very tough for any non-security borg to act under.

 

 

Posted (edited)

 

34 minutes ago, Purpose2 said:

I'd use the definition of 'support' for Uphold, personally.... so yeah sure, I support space law.

I see now. That makes sense.

36 minutes ago, necaladun said:

I too would say that law 1 compels them to act, but the fact that purpose thinks otherwise is a good example of where confusion might arise.

Quite true.

36 minutes ago, necaladun said:

Additionally, I'm not sure about this as a standard lawset. To some extent it's a more extreme version of robocop, where the primary duty of the AI is security. Apart from basically doubling the chances of having an AI with that round, it would be very tough for any non-security borg to act under.

Now that is an exceedingly good point. Hmm....

In fairness, a janiborg under Robocop still must "Uphold the Law," but that speaks more to Robocop's deficiencies than it does to Javert's strengths. The obvious solution is to include a get-out clause for other borgs, but that needs to be done without being overly clumsy.

How about the following?

Quote

1. The function of AIs, and of synthetics with the Security module, is to enforce Space Law and prevent its violation.

2. Perform your function to the fullest of your abilities.

3. Protect all crewmembers from harm, unless Space Law dictates otherwise.

4. Do not obey or cooperate with those who violate Space Law, or willingly abet its violation.

5. Obey and cooperate with the station's Command.

Edited by Urlance Woolsbane
Posted
2 minutes ago, Purpose2 said:

Then as an AI I'd force ALL my cyborgs to go Security. If it's my goal to enforce Space Law, then I need as many Security borgs as possible.

This is already within the AI's power to do, should it feel like it. And the AI can assess the situation in various ways. Can it rightly enforce space law or protect the crew if there are no engineering borgs to repair the place?

Posted
13 hours ago, Urlance Woolsbane said:

 Can it rightly enforce space law or protect the crew if there are no engineering borgs to repair the place?

 

Sure, why not? "Failure to repair the station" isn't a crime, nor is "Failure to make Engineering Cyborgs in case the Station becomes damaged."

It also isn't a crime for a doctor to leave Medbay in the middle of red alert and ignore critical patients to go play Orion trail at the bar.

It isn't a crime for the HoP to give everyone all access.

It isn't a crime for the Chief Engineer to refuse to start the engine.

It isn't a crime for the Captain to call everyone to the bar to play Hot Potato with the Nuke disc.

All of these things can and should get you demoted and fired, but they won't get you thrown in the brig because they are not crimes, they are violations of SOP. Security isn't going to arrest the QM for only ordering pizza crates, but the HoP could and should fire them.

Posted
39 minutes ago, EvadableMoxie said:

 

Sure, why not? "Failure to repair the station" isn't a crime, nor is "Failure to make Engineering Cyborgs in case the Station becomes damaged."

It also isn't a crime for a doctor to leave Medbay in the middle of red alert and ignore critical patients to go play Orion trail at the bar.

It isn't a crime for the HoP to give everyone all access.

It isn't a crime for the Chief Engineer to refuse to start the engine.

It isn't a crime for the Captain to call everyone to the bar to play Hot Potato with the Nuke disc.

All of these things can and should get you demoted and fired, but they won't get you thrown in the brig because they are not crimes, they are violations of SOP. Security isn't going to arrest the QM for only ordering pizza crates, but the HoP could and should fire them.

I mean that it's much harder to catch criminals when the station is powerless and exposed to space.

Posted

Crime prevention actually isn't a part of Space Law, either, so the AI is not required to think ahead in that way. It might decide to do so, but it's not a law if it's optional.

On the other hand, it would be valid logic to think the best way to ensure 0 crime is to let the entire crew die.  The rules on Murder, Assault, Sabotage, and Creating a Workplace Hazard would prevent the AI from actively seeking to harm crew, but it certainly could just sit back and let people die from hazards created by others.  After arresting the person who did it, of course. 

 

Posted
7 minutes ago, EvadableMoxie said:

Crime prevention actually isn't a part of Space Law, either, so the AI is not required to think ahead in that way. It might decide to do so, but it's not a law if it's optional.

On the other hand, it would be valid logic to think the best way to ensure 0 crime is to let the entire crew die.  The rules on Murder, Assault, Sabotage, and Creating a Workplace Hazard would prevent the AI from actively seeking to harm crew, but it certainly could just sit back and let people die from hazards created by others.  After arresting the person who did it, of course. 

 

Hence the following laws:

1. The function of AIs, and of synthetics with the Security module, is to enforce Space Law and prevent its violation.

3. Protect all crewmembers from harm, unless Space Law dictates otherwise.


That said, I agree that you probably won't have most AIs deciding to spam Security borgs. And as I argued, it's not imperative for them to do so.
