Could "fuzzing" voting, election, and judicial process improve decisionmaking and democratic outcomes?
Voting is determinative, especially where the constituency is precisely known, as with a legislature, executive council, panel of judges, gerrymandered electoral district, or defined organisational membership. If you know, with high precision, who is voting, then you can determine or influence how they vote, or what the outcome will be. This lends a certain amount of predictability (often considered a good), but also a tyranny of the majority. This is especially true where long-standing majorities can be assured: legislatures, boards of directors, courts, ethnic or cultural majorities.
The result is a very high-stakes game of establishing majorities, influencing critical constituencies, packing courts, and gaming parliamentary and organisational procedures. But is this the best method --- both in terms of representational equity and of decision and governance quality?
Hands down the most fascinating article I've read over the past decade is Michael Schulson's "How to choose? When your reasons are worse than useless, sometimes the most rational choice is a random stab in the dark", in Aeon. The essay, drawing heavily on Peter Stone, The Luck of the Draw: The Role of Lotteries in Decision Making (2011), which I've not read, mostly concerns decisions under uncertainty and the risk of bad decisions. It seems to me that it also applies to periods of extreme political partisanship and division. An unlikely but possible circumstance, I'm sure....
Under many political systems, control is binary and discrete. A party with a majority in a legislature or judiciary, or control of the executive, has absolute control, barring procedural exceptions. Moreover, what results is a politics of veto power, where the bloc defining a controlling share of votes effectively controls the entire organisation. It may not be able to get its way, but it can determine which of two pluralities can reach a majority. Often in favour of its own considerations, overtly or covertly --- this is an obvious engine of corruption.
(This is why "political flexibility" often translates to more effective power than a hardline orthodoxy.)
One inspiration is a suggestion for US Supreme Court reform: greatly expand the court, hear more cases, but randomly assign a subset of judges to each case.[1] A litigant cannot know what specific magistrates will hear a case, and even a highly-packed court could produce minority-majority panels.
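The odds of such a minority-majority panel can be worked out with the hypergeometric distribution. Here is a small sketch; the court size, bloc split, and panel size are hypothetical illustrations, not figures from any actual proposal:

```python
from math import comb

def minority_majority_prob(court_size, majority_bloc, panel_size):
    """Probability that a panel of panel_size judges, drawn at random from
    a court of court_size (majority_bloc of whom belong to the larger bloc),
    contains a majority from the *smaller* bloc."""
    minority_bloc = court_size - majority_bloc
    needed = panel_size // 2 + 1  # panel seats the minority bloc needs
    total = comb(court_size, panel_size)
    return sum(
        comb(minority_bloc, k) * comb(majority_bloc, panel_size - k)
        for k in range(needed, min(minority_bloc, panel_size) + 1)
    ) / total

# Hypothetical: a 19-justice court split 11-8, hearing cases in panels of 5.
p = minority_majority_prob(19, 11, 5)  # roughly a one-in-three chance
```

Even a court packed 11-8 would, on these (made-up) numbers, hand the minority bloc a panel majority about a third of the time, which is the point: a litigant cannot count on the court's overall composition.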
Where voting can be fuzzed, the majority's power is made less absolute and more uncertain, and decisionmaking which must presume that a majority cannot be assured would, one hopes, be more inclusive. Some specific mechanisms:
- All members vote, but only a random subset of votes is counted. The larger the subset, the more reliably the true majority wins.
- A subset of members votes, as in the court example above.
- An executive role (presidency, leader, chairmanship) is rotated over time.
- For ranged decisions (quantitative, rather than yes/no), a value is selected randomly based on weighted support.
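The first mechanism can be simulated directly. A minimal sketch (my own illustration; the 55-45 split, subset sizes, and trial count are arbitrary assumptions) showing how subset size trades fidelity against fuzz:

```python
import random

def fuzzed_outcome(votes, subset_size, rng):
    """Count only a random subset of the cast votes; return True if 'yes'
    wins the sampled count. Use odd subset sizes to avoid ties."""
    sample = rng.sample(votes, subset_size)
    yes = sum(sample)
    return yes > subset_size - yes

def upset_rate(n_yes, n_no, subset_size, trials=10_000, seed=1):
    """Fraction of trials in which the true minority ('no') wins
    the fuzzed count despite losing the full count."""
    rng = random.Random(seed)
    votes = [True] * n_yes + [False] * n_no
    upsets = sum(
        not fuzzed_outcome(votes, subset_size, rng) for _ in range(trials)
    )
    return upsets / trials

# A 55-45 body: counting 5 random votes upsets the majority far more
# often than counting 51 of them.
r_small = upset_rate(55, 45, subset_size=5)
r_large = upset_rate(55, 45, subset_size=51)
```

Tuning `subset_size` is effectively the dial on how much the majority's power is fuzzed.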
Consensus/majority decisionmaking tends toward locked and unrepresentative states. Fuzzing might better unlock these and increase representation.
Notes
- A selection of articles on Supreme Court reforms and expansion, from an earlier G+ post: https://web.archive.org/web/20190117114110/https://plus.google.com/104092656004159577193/posts/9btDjFcNhg1 Also, notably, court restructuring or resizing has been practiced: "Republicans Oppose Court Packing (Except When They Support It)".
- Jonathan Turley at WashPo, suggesting 19 justices: https://www.washingtonpost.com/opinions/the-fate-of-health-care-shouldnt-come-down-to-9-justices-try-19/2012/06/22/gJQAv0gpvV_story.html
- Robert W. Merry at The National Interest, agreeing: https://nationalinterest.org/blog/the-buzz/court-packing-revisited-7123
- Michael Hiltzik at the LA Times: http://www.latimes.com/business/hiltzik/la-fi-hiltzik-scotus-20180629-story.html
- Jacob Hale Russell, at Time, suggests 27 justices: http://time.com/5338689/supreme-court-packing/
- And Glenn Harlan Reynolds, at USA Today, ups the ante to 59 justices: https://www.usatoday.com/story/opinion/2018/07/02/make-supreme-court-lots-bigger-59-justices-more-like-america-column/749326002/
- Dylan Matthews at Vox, pointing at several other suggestions: https://www.vox.com/2018/7/2/17513520/court-packing-explained-fdr-roosevelt-new-deal-democrats-supreme-court
- From the left, Todd N. Tucker at Jacobin: https://jacobinmag.com/2018/06/supreme-court-packing-fdr-justices-appointments
- Scott Lemieux at The New Republic: https://newrepublic.com/article/148358/democrats-prepare-pack-supreme-court
- Ian Millhiser at Slate: http://www.slate.com/articles/news_and_politics/jurisprudence/2015/02/fdr_court_packing_plan_obama_and_roosevelt_s_supreme_court_standoffs.html
- Zach Carter at Huffington Post: https://www.huffingtonpost.com/entry/hey-democrats-pack-the-court_us_5b33f7a8e4b0b5e692f3f3d4
- A pseudonymous piece by "@kept_simple" at The Outline: https://theoutline.com/post/5126/pack-the-court-judicial-appointment-scalia-is-in-hell
- And a dissenting opinion from Josh Blackman at National Review: https://www.nationalreview.com/2018/07/supreme-court-nominee-court-packing-not-feasible/
- As well as some alarm klaxon sounding from The Daily Caller: https://dailycaller.com/2018/06/28/democrats-pack-supreme-court/
I think a much more elegant solution, one that avoids the additional democratic deficit introduced by the randomness of fuzzing, and that also minimizes the existing democratic deficit of having votes thrown out, is to implement leveling seats.
Making votes count, even if a party gets only a couple of percentage points of the vote spread across a huge geography, means that the electorate is much more fairly represented. There are a number of ways of implementing this, but certain properties are common to all of them.
Philadelphia has a pretty decent system for city council:
The Working Families party was able to oust a Republican by strategically siphoning a few votes from the Democrats, who of course screamed bloody murder about 'risking a loss'.
How are the two non-balloted at-large seats chosen?
I believe it's the next highest vote earners, so in practice they were Republican seats.
So the Republicans were quite unhappy with losing one.
Is the idea that, given an uncertain outcome, the majority would be more willing to come up with a compromise that satisfies a plurality in order to stave off total defeat? My first instinct was that the parties would simply establish their preferred position and expect their members to toe the line, by and large as happens today. But I think I could see circumstances where, if the stakes of losing are perceived to be too high, or one possible outcome is seen as too extreme, the uncertain outcome could drive compromise.
Another procedural point (somewhat tangential to the sortition proposal) would be to make the voting process anonymous, both as to which members' votes are selected and as to how they voted, to hopefully further dilute the power of the party leaderships, and give members the ability to vote how they feel without hyper-partisan backlash.
There are a few points at play, and as I dive into Schulson's sources and revisit the court-reform proposals, several are addressed there:
There are some possible negatives:
Unlike vote-levelling, though, vote fuzzing achieves reduced partisanship through a procedural rather than structural change. I see fuzzing as a more generalised solution.
I've been vague about just how "fuzzed" votes should be. I'm not sure, and possibilities range from drawing a single ballot from the cast set (this resembles the original Greek method -- counting all votes was far more difficult) to drawing a sample of cast votes. With a true random sample, the accuracy of an estimator is almost entirely independent of population size; it depends instead on the sample size. For a fuzzing operation, smaller samples would be preferable in the sense of increasing randomness. Generally, "large sample" statistics begin at a sample size of 30, or n=30. With a history of votes, it's possible to model alternative outcomes through Monte Carlo modelling: repeatedly re-running the election by selecting different subsets of the reported cast votes. The fraction of subsets whose outcome differs from the full count indicates the level of random fuzzing.
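The sample-size point is easy to check numerically. A sketch using the standard margin-of-error formula with the finite-population correction (the proportions and sizes below are arbitrary assumptions for illustration):

```python
from math import sqrt

def margin_of_error(p, n, population, z=1.96):
    """Approximate 95% margin of error for a sample proportion p from a
    sample of n, with the finite-population correction applied for
    sampling without replacement from the given population."""
    fpc = sqrt((population - n) / (population - 1))
    return z * sqrt(p * (1 - p) / n) * fpc

# Accuracy depends strongly on sample size n, barely on population size:
moe_small_pop = margin_of_error(0.5, n=100, population=1_000)
moe_large_pop = margin_of_error(0.5, n=100, population=1_000_000)
# The two margins are nearly identical; growing n, not the population,
# is what tightens the estimate.
moe_bigger_n = margin_of_error(0.5, n=1_000, population=1_000_000)
```

For fuzzing, this cuts the other way: to keep outcomes genuinely uncertain, the counted sample must stay small relative to what good estimation would demand.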
How would you ever hold politicians accountable if all their votes were anonymized? Politicians could simply claim to their constituents and the press that they voted one way, when they actually did the opposite. As a result, that sort of system would be absolutely ripe for corruption and bribery, IMO.
Hmm, good point. I’ll admit it’s not my brightest idea ever, but I also don’t know if it’s that much worse than our current system of non-transparency on the influence end (e.g., the ‘legal’ corruption of PACs and dark money here in the states), and hyper transparency and public excoriation at perceived disobedience that makes it effectively impossible for most politicians to go against the party line. I’m not sure how to get out of that without providing some measure of anonymity (probably less than full), unless the fuzzing itself can change the structural and procedural incentives enough to break the accelerating trend towards hyperpartisan top-down leadership, as @dredmorbius suggests.
Fuzzed votes need not be anonymised.
The anonymous vote has value for the electorate, to avoid vote buying or selling. Though fuzzing itself is at least a partial protection against this.
For representatives or judges, selectively dropping votes is a method already practiced in Olympic figure skating, for example:
From TFA.
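One simple form of selective dropping is a trimmed mean, in which the highest and lowest marks are discarded before averaging. A minimal sketch (my own illustration, not code from the article):

```python
def trimmed_mean(scores, drop=1):
    """Mean after discarding the `drop` highest and `drop` lowest scores,
    in the style of figure-skating judging; a single extreme judge
    cannot move the result."""
    if len(scores) <= 2 * drop:
        raise ValueError("not enough scores to trim")
    ordered = sorted(scores)
    trimmed = ordered[drop:len(ordered) - drop]
    return sum(trimmed) / len(trimmed)

# A rogue low score of 3.0 is simply dropped:
result = trimmed_mean([9.0, 9.1, 9.2, 9.0, 3.0])
```

The same idea transfers to ranged votes: discard the extremes, then average (or randomly sample) what remains.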
For elected representatives, all votes could be recorded but only a subset selected. Again, at random.