Algorithm Integrity

latest update 11Nov 22:00 MST

(all dates 2022, click above link to jump to latest entry)



This is an ongoing document, to be frequently updated and revised where necessary, on Algorithm Integrity, which studies the question of whether published software is good (conforms to God's Law) or evil (does not), and what to do about it.  Published software is any software that controls machines (surveillance cameras, facial recognition, artificial "intelligence") or online information (Google search, GMaps, Facebook, Twitter, etc.).

Many examples of harm being done may be found.  My objective is to collect source material from many of the technical sites and podcasts I follow, organize it to teach interested followers the pertinent details, and to consider ways to effectively address egregious cases.  Because this is just the seed of the idea, there is no predicting yet what will become of my project, but I firmly believe it is a subject of special interest to Christians and there will be more than a few who may find opportunity for themselves and/or their family members to help set standards and ultimately influence positive change.


What is an algorithm?

An algorithm is a procedure, a recipe, that can be performed by a machine or by a person.  A good example is my algorithm for setting up a phone call:

1.  send a text to the person's smart phone asking "is now a good time for a phone call?"
2.  call if the answer is "yes", or
3.  wait that amount of time if a time is given in the response, or
4.  wait <specific amount of time>, then repeat from step 1.

Simple, only four steps, but a polite way to renew voice contact with the party being called.  Obviously, a machine could also do these four steps, but that also shows how algorithmic abuse can occur (likely inadvertent in this case): if the machine runs this without further consideration of possible consequences, we might find it sending texts every hour, all night long, around the clock.  For a machine, we cannot assume common sense or any understanding of politeness.
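For the technically curious, the four steps above can be sketched in code.  This is only a minimal sketch in Python; the function names, the default wait, and the quiet-hours guard are my own hypothetical additions, illustrating how the "common sense" a machine lacks must be programmed in explicitly.

```python
# A sketch of the four-step calling algorithm above, plus the common
# sense a machine lacks: a quiet-hours guard so it never texts all
# night long.  All names and numbers here are illustrative.
DEFAULT_WAIT_HOURS = 4  # stands in for step 4's <specific amount of time>

def is_quiet_hours(hour: int) -> bool:
    """True between 9 PM and 8 AM, when no polite text should be sent."""
    return hour >= 21 or hour < 8

def next_action(reply):
    """Decide what follows step 1's text, per steps 2 through 4."""
    if reply == "yes":
        return ("call", 0)                      # step 2: call now
    if isinstance(reply, (int, float)):
        return ("wait", reply)                  # step 3: wait the given hours
    return ("wait", DEFAULT_WAIT_HOURS)         # step 4: wait, then repeat
```

A driving loop would send the text, pass the reply to next_action, and call or sleep accordingly, checking is_quiet_hours before every send.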

References

Consciousness Revisited - EETimes interview with Federico Faggin, a titan of the semiconductor industry, making the case that computers can NOT replicate human consciousness.


Silicon - Autobiography of Federico Faggin ("Fah-gene' ").  I was led to this book by the above interview.  It has some very technical parts that describe Faggin's seminal role in the microelectronics industry as a prolific inventor of key technologies like "Silicon Gate Technology" (SGT), and the first microprocessor.  At the time he was working out these technologies, I was involved in test instrumentation and later something called a Microprocessor Development System that was essential for efficiently programming the flood of new microprocessors that came out of Silicon Valley, California in the 1970s.  Revisiting that history, which paralleled my own career (although at a very peripheral level compared to Faggin), was a fascinating and wonderful trip down memory lane for me, to the point that I found it almost impossible to put the book down.

Pertinent to the Christian perspective, he was raised a Catholic in northern Italy (born 5 days before Japan attacked Pearl  Harbor), and early on describes perceiving "a deep connection with Jesus" although so far (3/4 through the book) he does not testify to being saved.  His focus on consciousness is described in a coming chapter that I have yet to read, and will comment on further perhaps following that reading. 

To close the loop on Mr Faggin's autobiographical book, he never makes the leap to recognizing God's role in creation.  But I applaud his recognition of the difference between computers (incapable by their very nature of anything beyond mechanical execution of a program) and living things.  The published worries of prominent commentators like Elon Musk and Stephen Hawking, warning about sentient, conscious machines taking over our lives, are quite effectively discounted by Faggin's arguments.

DLT (posted Feb 8th)

The next topic I want to introduce is distributed ledger technology.  DLT is the preferred acronym of insiders.  It is defined by Wikipedia (an adequate if not eloquent definition) as "a consensus of replicated, shared, and synchronized digital data geographically spread across multiple sites, countries, or institutions" which provably secures its data outside the authority of any central administrator.  DLT is more popularly known as Blockchain Technology.  It is a technical breakthrough of enormous consequence, first published in the 2008 white paper by Satoshi Nakamoto (pseudonym).  This technology is the foundation of the token "Bitcoin", the very first such token, and of thousands of tokens that have arisen since.  To get a quick overview of how many there are and the key characteristics of each, check out CoinGecko.

While my exposure to this technology is still neophyte level, I have learned enough to recognize its potential to bring new levels of freedom and independence to people around the world.  ...And as in all things technology, it can be abused by those with satanic intent to further oppress and exploit humanity.  I therefore assert that it must be included in any effort to achieve algorithm integrity and will be a major topic of this study.  

Feb10 - Investigation of GTC* per Epicenter podcast interview with founder Kevin Owocki

This young man is filled with ideas for programming money via DLT for social and human "good".  He's obviously brilliant, but also of a young age that has clearly been affected by what our Christian community considers woke ideas, such as prioritizing Climate Change as a top, most important problem to address algorithmically.  Young Mr. Owocki's interview is candid and has more than a few very illuminating observations, but the concern is that his intelligence, programming skill, and overall crypto talent runs the risk of implementing algorithmic governance which embeds woke and likely unbiblical rules.  The resulting DLT implementation has the potential to force hardship on humans, and thus we must be aware and stay on top of every such project to make sure it has algorithmic integrity.

*GTC is the symbol for Gitcoin, ranked #400 token by market cap in Coingecko.  Anything in the top 500 tokens (out of >10,000 tokens) has proven staying power and thus could be a threat if implemented without complete integrity.
  
Feb13 - I want AI to stand for Algorithm Integrity.  Forget AI as Artificial Intelligence.  Federico Faggin shows us there is no such thing.  Machines can only respond to situations they already have a program for.  They cannot reason.  They never will.  So when considering Algorithm Integrity, the topic of this study, do NOT try to think of ways any programmable machine can solve the problem.  Rather, consider whether the proposed algorithm is implemented with INTEGRITY or not.  Learn to recognize which case each is.  This is the first function of an auditor.  Also consider possible markets for AUDITED ALGORITHMS.


Feb20 - Consideration of Geordie Rose quantum computing claims - My initial reaction:  there is one creation - God breathed it all into existence in 6 days.  Parallel universes, each with variations across all statistical possibilities, certainly can be imagined by man, as Mr Rose is demonstrating... but so far the qualification is always "if you could build one".  This is the same trap proponents of string theory fall into.  The math is so beautiful, they exclaim.  If we just tweak a few numbers to make our theory compute out completely, it "proves" string theory.  To my knowledge, no evidence from the physical world confirms any aspect of string theory to this day (a claim that proponents constantly challenge, but one that, in my view, physicist Lee Smolin thoroughly debunks in his book, The Trouble with Physics, especially the chapter "Beyond String Theory - Surprises from the Real World").

Geordie continues..."Can we build machines like us?   Intelligent machines are very exciting to me", he says.  (He's in love with his creation...)  Other comments from his vid indented below:

2^500 parallel universes: if we are smart enough, we can pull aspects from multiple such "universes" together.  This is based on doubling the number of qubits that can be fabricated onto a chip every year.  Nothing he said proves one universe per qubit - it is nothing but fanciful conjecture, imho.

his own "Moore's law" is 2x per year (Gordon's was 1.5x / year)

Reference picture of the Technology Review cover (a magazine I am quite familiar with because it is distributed to alumni).  Here, my ire at the extent to which woke reasoning has infected MIT's institutional thinking is significant, although I do accept his assertion that successful advances to date are trivial compared to the imagined future picture he paints, despite it being total fantasy at this stage.

His Predictions:
1. within 5 years we'll find another earth like planet (no connection with his stated topic)
2. gravitational lens - will be able to test (as far as I know, this has already been proven)
3. believes machines will outpace us in everything within 15 years

To prediction #3 in particular, I say NO.  The best arguments supporting this are Federico Faggin's, above (see Consciousness Revisited and Silicon).

I stand by my Feb13 post. 

Feb22 - Epicenter podcast episode 431 interviews the founders of the NEAR smart contract platform, which has several innovations (very technical - listen for yourself if interested in such detail), but which for me was most important for introducing me to the programming language they used (Rust), and the library of functions they have published in open source to make writing contracts on this platform more accessible, secure, and correct by construction.  This is one possible answer to the question "what should my kid study?"  With so many possibilities for DLT projects, programming languages, and ecosystems to choose from, it is very helpful to have input from a successful project (ranked #27 in market cap as of this writing - worth ~US$5.6 billion) where demand for talent is strong and the opportunity to make a real contribution and get paid for it is realistic (not an unreachable dream).

Feb26 - Epicenter podcast Episode 432, published just this past Tuesday, introduced me to the AssangeDAO, which helped me understand details of NFT creation and distribution, and how their first NFT raised over $50 million for Julian Assange (now being held in Belmarsh prison in South East London while he fights on in UK court to prevent extradition to the US).  The US's 10Dec2021 court win galvanized a crypto savvy group of supporters to quickly create an NFT, put it out to the world, and create a $50 million war chest for legal defense.  This kind of success was inconceivable not long ago, and yet now, through the miracle of DLT and a DAO, was accomplished in 9 weeks!

Terminology

DAO - Decentralized Autonomous Organization
NFT - Non-Fungible Token (a unique, one-of-a-kind creation that has value to select audiences)

This example makes a strong case for finding ways to achieve similar big, expensive objectives, and to develop ways to ensure what is being encoded is just and right in the Eyes of God. 


Feb27

Regarding artificial...

We do not have any reason to believe "machines will outpace us in everything within 15 years" (point 3 above) because we humans, groveling on the earth, CANNOT build such a machine, either by accident or intent.  It is not a worry for Believers, and that is part of the good news we are to rejoice in and tell everyone about.

Let's stop worrying about it and concentrate on learning and following God's law. 


Mar18                       

Epicenter podcast 435 introduced me to Worldcoin.  This is "a start-up trying to engineer widespread adoption for its cryptocurrency by giving away tokens.  Sounds too good to be true?  Depends upon how you look at it (literally).  Participants must be willing to have their eyes scanned on registration by a proprietary iris scanner dubbed 'orb'.  Worldcoin says the scan is necessary to prove that users are human and to verify they only register once."  (quoted from webpage introducing the episode).

The founder and his team are another young group of extremely intelligent academic graduates, with advanced degrees from Caltech and elsewhere, who have been driven to create this project by their firm belief in egalitarianism, which is in essence the desire to "elevate" every person on the planet by removing worries about basic survival.  They seek to achieve Universal Basic Income (UBI) as a primary goal.

Fortunately, the skepticism of the Epicenter interviewers came through loud and clear, especially regarding the issues around maintaining the necessary level of security for the database of iris scans the project is collecting.  We discover during the interview that the greatest uptake of the tokens has been in African countries like Kenya and South Africa, and in South America.  It is very concerning that naive adopters willingly accept iris scans as the price of entry to receive their free tokens.  It also came out later in the interview that the usual suspects providing start-up financing are venture capitalists entangled with the WEF and with financial incentives not aligned with the early adopters.

Dear readers - remember that my purpose here is to bring examples of technologies to the attention of Christians so that we may be aware of the onrush of new programmable ideas (algorithms) and be enabled to pray for understanding and guidance according to Christian principles.  Beyond that, an earthly purpose for some will certainly be to interact, audit, and help with that guidance if we are called upon to do so. 

Mar21

More thoughts on UBI here


Mar26

Is there a way to implement algorithms to check other algorithms for integrity?  Keeping in mind that machines can only solve problems they have seen before, some contend that via ML (Machine Learning), programs can be constructed that analyze algorithms for code that somehow restricts the freedom of users or forces them into un-Christian paths, AND start to build a set of routines to handle every identified case.  (need examples... this is at least step 2).

Step 1 could be to select programs for further analysis based on meta data (key words / phrases) found such as:  Climate Change, UBI, CRT (Critical Race Theory or just Critical Theory), Artificial Intelligence, Equity (as opposed to Equality)...
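Step 1 could be sketched as a simple metadata scan.  The phrase list and function below are hypothetical illustrations of the idea, not a finished policy; word-boundary matching is used so a short term like "ubi" does not falsely match inside a word like "public".

```python
# A minimal sketch of step 1: flag programs whose metadata mentions
# watch-list phrases, queueing them for further (human) analysis.
import re

WATCH_PHRASES = [
    "climate change", "ubi", "universal basic income",
    "critical race theory", "critical theory",
    "artificial intelligence", "equity",
]

def flag_for_review(metadata: str) -> list[str]:
    """Return the watch-list phrases found in a program's metadata."""
    text = metadata.lower()
    return [p for p in WATCH_PHRASES
            if re.search(r"\b" + re.escape(p) + r"\b", text)]
```

Anything flagged would go to step 2, the deeper analysis; the scan itself decides nothing.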

How would this apply to Google maps and Google directions?  How would this apply to social media, or Amazon's control over network nodes where they can arbitrarily stop content they don't like from proceeding to its intended destination?  Can Web 3.0 solve this?

List as many examples as possible...

Submit suggestions to ALGO-integrity@compis.net

Apr 27

Implications of Elon Musk's buy-out of Twitter

    Mr. Musk has clearly stated that his major objectives in taking Twitter private are: to bring back legal free speech, where illegal is limited to clearly threatening or dangerous speech (such as yelling "fire" in a crowded theater); to shut down the bot manipulation of follower populations (whereby computer algorithms retweet posts to give them far more weight than the few human actors that set them in motion); and to open source the algorithms that elevate or submerge tweets, so that the biases that govern them are obvious and the egregiously biased algorithms can be removed.  This is a major positive development for Algorithm Integrity.  Of course, execution is critical and there are many challenges ahead.

    To recap briefly, conservative glee at Mr. Musk's bold move against woke leftist speech censors has abounded since he first announced his just-under-10% stake a few weeks ago.  Despite the Twitter insiders' raving lunacy over this billionaire's takeover, versus their fawning worship of billionaire Jeff Bezos' takeover of the Washington Post a few years back, it became clear within days that Elon had the finances to get his way and complete taking Twitter private no later than October 2022 (the delay due to the complicated terms and conditions involved in a corporate takeover of this magnitude).

    Now comes the hard work of implementing the changes he proposes.  He has to make the right new hires, replace the current crop of woke lefties driving and implementing Twitter censorship policies, deal with potential government interference on both the state and federal level, and meet the infrastructure challenges posed by Twitter's current dependence on companies run by woke tech titans.  Moreover, despite Twitter having a tiny daily active user base compared to Facebook, Instagram, and other more widely used platforms, Twitter punches way above its weight because it is the opinion maker for all of mainstream media.  The vast majority of Twitter daily active users are from corporate media giants like ABC, CBS, NBC, CNN, FOX, and MSNBC in the US, CBC in Canada, BBC in the UK, ABC in Australia, and others elsewhere around the world.  The screaming and teeth-gnashing from these quarters is loudly calling for corporate and government action to preserve their ability to censor any speech they disagree with and control the narrative to the benefit of their left wing benefactors.

    It is not widely known, but four gigantic tech companies, Amazon, Apple, Google, and Facebook, have the ability to entirely shut down Twitter if they so choose, for no more reason than that their woke sensibilities are upset.  Amazon can shut off their servers (AWS - Amazon Web Services), Apple and Google can delete the iPhone and Android apps from their app stores, and Facebook can censor any content they don't like.  This is no idle threat.  It actually happened to the Parler social media platform on January 7, 2021, the day after the democrat fabricated US Capitol "insurrection".

    Depending upon how thoroughly Twitter's code is integrated into AWS as a result of using many of their proprietary building blocks and tools, moving to other server infrastructure is much more than just taking an image copy of the code and porting it onto different hardware.  Much of the code would have to be rewritten, a time-consuming, months-long effort.  When Parler finally came back, it had lost so much momentum that its market presence is now just a fraction of where it was on January 6th.  More evidence comes out every day showing how pressure from powerful interest groups, political parties, and governments has forced changes and shutdowns of content they deem politically incorrect.  There is now little doubt that this was one of several factors deliberately crafted by the democrat party to steal the 2020 US Presidential election.  Elon has a big target on his back from Anonymous (a major computer hacking organization), from foreign governments, from the current US Government, and from woke tech titans Bezos, Cook, Pichai, and Zuckerberg.

    Next on the list of mountains to climb is the threat China poses.  What does Elon do if the Chinese government pressures him to take certain content off Twitter?  His bold and correct statements about the critical importance of free speech as defined in the US Constitution do not align with the way the Chinese look at and regulate speech.  There is much evidence of the concessions made by the NBA, Hollywood, and Disney to Chinese pressure to gain access to their multi-billion dollar markets.  Tesla too has made concessions to gain the benefit of cheap Chinese labor, subsidized factory incentives, and massive sea port infrastructure.  When it comes to maintaining the power of the Chinese leadership, China will stop at nothing.  A clash between China and Twitter is all but inevitable if Mr. Musk is able to implement the changes he seeks.  It would not be surprising to see Twitter banned by China as a result.

  Whether or not Elon Musk succeeds in his efforts to fix Twitter, his bold action and gigantic financial commitment validates Algorithm Integrity as being on the front line of the battle to restore the American Constitutional Republic.  Nothing is more important in this digital age.  Fixing the corrupt Twitter censorship regime will strike a major blow for individual liberty and freedom of expression in this titanic battle.  Way to go Elon!

May25

Proving Distributed Ledgers (Blockchains)

Introduction

Distributed Ledgers are databases that are duplicated on multiple computers.  Each separate computer in the network maintains an exact copy of the data in the ledger.  To make this useful, there has to be a way to ensure (prove) that all copies are identical, and a way to add new data while still keeping all copies in perfect synchronization.  Distributing multiple copies of the same data provides failure resilience should some computers in the network break down or go offline (because the data can still be recovered from the remaining operational computers), and prevents fraud by providing a practical way of obtaining a unique verification number (a "hash" code) that can only be generated from an exact copy.

Since copying the entire database to every computer in the network for every update is impractical because of its ever-growing size (limited bandwidth and ever greater time to distribute each new update), distributed ledgers add new data in fixed-size chunks, or blocks, and chain them together, one after another, as each new block fills up.  Each distributed ledger starts out with rules that specify the size of each new block and the algorithm for finding the unique verification number.

Building a database as a distributed ledger has another major advantage - the ability to constantly prove the accuracy of the data without having to trust any single entity.  Once a new block of data is added and the verification number found by one computer on the network, a majority of other computers in the network can verify they are able to generate the same hash code and know that all the data accumulated so far in all the blocks continues to remain in exact synchronization. It is this feature that makes Distributed Ledger Technology (DLT) so attractive for Internet money. 
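The verification idea described above can be illustrated with a toy chain in Python.  Each block's hash covers its data plus the previous block's hash, so two copies of the ledger produce the same final hash only if every byte of every block matches.  (A real ledger hashes structured blocks, not plain strings; this is only a sketch.)

```python
import hashlib

def block_hash(prev_hash: str, data: str) -> str:
    """Hash a block's data together with the previous block's hash."""
    return hashlib.sha256((prev_hash + data).encode()).hexdigest()

def chain_hash(blocks: list[str]) -> str:
    """Fold the whole chain into one 64-character verification number."""
    h = "0" * 64                      # placeholder hash before the first block
    for data in blocks:
        h = block_hash(h, data)
    return h

# Two exact copies agree; a one-character tamper changes the result.
ledger = ["Alice pays Bob 5", "Bob pays Carol 2"]
assert chain_hash(ledger) == chain_hash(list(ledger))
assert chain_hash(["Alice pays Bob 50", "Bob pays Carol 2"]) != chain_hash(ledger)
```

Because each hash feeds into the next, altering any early block changes every hash after it, which is why a majority of honest computers can detect a bad copy.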

Three Proof Methods

Proof of Work (POW) -

All computers in the network independently run the verification algorithm to find a hash with a certain number of leading zeros.  They do this by adding a random starting number (called a "nonce") to the block, running the hashing algorithm, then incrementing the nonce and re-running it until the output contains the required number of leading zeros (as of December 2020, 19 leading zeros in a 64-character verification number).
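The nonce search just described can be sketched in a few lines of Python.  Real networks require around 19 leading (hexadecimal) zeros; the sketch defaults to 4 so it finishes in a fraction of a second.

```python
import hashlib

def mine(block_data: str, difficulty: int = 4) -> tuple[int, str]:
    """Increment a nonce until the hash starts with `difficulty` zeros."""
    nonce = 0
    target = "0" * difficulty
    while True:
        digest = hashlib.sha256(f"{block_data}{nonce}".encode()).hexdigest()
        if digest.startswith(target):
            return nonce, digest
        nonce += 1
```

Note the asymmetry: finding the nonce takes many hash attempts, but any other computer can verify it with a single hash.  That asymmetry is the whole point of proof of work.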

Proof of Stake (POS) -

A randomly selected subset of computers that have each met a staking requirement (i.e., have put up collateral of a minimum value) runs the algorithm that generates the verification number for each new block.  This is less energy intensive because only a subset of the computers in the network expend the energy required to find the hash code.  Each new block verification is assigned to a new randomly selected subset of all network computers.

Proof of Burn (POB) -

This is a variation of POS whereby computers seek the right to validate a block by spending the currency of the network rather than putting it up as collateral.  The randomly selected computers must fall within a group that has spent at least the minimum amount to be included in the selection group.  The result is the same as for POS - fewer computers are needed to run the verification algorithm and agree that the next DLT block is an exact copy, thereby reducing the total energy expended.

May26

Stable coins

A stable coin is a crypto currency that is designed to maintain its value equal to another financial asset, usually a fiat currency like the US dollar.

Crypto assets are very volatile compared to fiat currencies.  Their prices can move 10% or more in a single day, compared to forex price movements between two fiat currencies, which rarely move even 1% in a day.  For a stable coin to serve as a reliable medium of exchange, a mechanism is required to ensure convertibility at parity with the linked financial asset.  Crypto projects usually ensure this by holding reserves of the financial asset equal to the pegged value of the crypto tokens in circulation.  A stable coin valued at $1 with a billion tokens in circulation would require one billion US dollars held in reserve at a bank to ensure the holder can always convert between the two at parity, for example 1 USDT stable coin for 1 USD.

Stable coins have also been backed by other crypto tokens.  In this case, the volatility of both tokens is such that investor confidence in full convertibility under all conceivable circumstances requires over-reserving the backing asset against the stable coin.  It should be concerning to holders of such stable coins that even reserving twice as much backing crypto as the nominal value of the stable coin in circulation can only protect against a 50% decline of the reserve currency.  In order to get investors to take this risk, such stable coin projects may offer very high interest rate returns compared to traditional banks for money invested or "staked" in their token.  Annual percentage rates of 5% up to as high as 20% have been offered for this purpose.  Such rates can partially mitigate the risk that the backing crypto may not maintain its value within the 50% range of parity.
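The arithmetic behind that 50% figure is worth making explicit.  A minimal sketch, with made-up dollar amounts:

```python
def coverage_ratio(reserve_value: float, stablecoin_supply: float) -> float:
    """Dollars of reserve backing each $1 of stable coin outstanding."""
    return reserve_value / stablecoin_supply

# $2 billion of volatile crypto backing 1 billion $1 tokens: 2x reserved.
assert coverage_ratio(2_000_000_000, 1_000_000_000) == 2.0

# If the reserve asset falls 50%, coverage is exactly 1.0 with no cushion
# left; any further decline leaves the stable coin under-backed.
assert coverage_ratio(2_000_000_000 * 0.5, 1_000_000_000) == 1.0
```

In other words, 2x over-reserving buys protection against exactly a halving of the backing asset, and nothing beyond it.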

It has also happened that crypto project leaders have addressed defaults or other impairments to token value by making changes to blockchain operation (called a hard fork) to deal with such disasters.  Their proprietary pride in their creation is a heavy motivation for doing everything they can to maintain confidence in the token of their project.  Ethereum has done this more than once in the course of its short history.

Finally, there have been a few crypto projects that have attempted to fully guarantee stable coin convertibility at parity without any reserves at all, using only algorithmic methods.  This type of guarantee has recently had a catastrophic failure.  In early May, the Luna token, which was the backing token for the Terra UST stable coin, began a precipitous decline that reduced its value so much that it could no longer meet its obligation to support the price of Terra at $1.  Terra tokens invested in Anchor Protocol, a crypto savings plan promising 18% APR "interest", could no longer be withdrawn and converted back to $1 US.  Anchor Protocol is now a total disaster.  If one attempts to withdraw those tokens and convert to USD, it results in a 99.99% loss of invested capital.

Rescue attempts have been mounted by the Terra / Luna project, but so far nothing firm has been put into place that promises a more substantial recovery.  Repercussions, in the form of government hearings in Korea (where Terra / Luna is based) and unwanted attention from trans-national organizations like the World Economic Forum, have set back the prospects for Distributed Ledger (blockchain) stable currencies to ultimately win out over centrally controlled fiat currencies.  Once again, Algorithm Integrity pops up as the key discipline that must be mastered for long term success aligned with Christian values in our modern technology driven age.

Nov11

Factors in election corruption:

1.  mail-in ballots
2.  extending voting beyond election day
3.  usage of machines that cannot be audited
4.  change from precinct vote counting to centralized vote counting
5.  loosening or dropping voter ID requirements 
6.  unlimited campaign contributions
7.  unenforceable truth in advertising

    Election corruption is still rampant in American states - my former home state of PA leads the list.  In Pennsylvania, as in Michigan, Wisconsin, Nevada, Arizona, and other states, much of this can be traced to mail-in ballots and to extending election day to weeks before the actual day, sometimes even accepting new ballots for days after the election.  This change makes it far easier to "stuff" the ballot box with votes without proper attribution to an actual live voter who is a citizen and a resident of the jurisdiction.  No amount of poll watchers or election lawyers can close all the gaps available for cheating when election day becomes election season and ballots can be mailed in.  Americans have seen in Florida that these problems can be fixed, because Florida, once the infamous state of the "hanging chad" election (which delayed the result into December 2000), finished its vote counting within hours of polls closing in 2022.  For the other states, too many politicians benefit from this corruption to close the loopholes quickly.  Arizona and Nevada are two states where this battle is in full onslaught right now.  Those in power are doing everything possible to delay and overturn the massive support for opposition candidates who made election integrity a foundation of their campaigns.

   Voting machines have been a part of American elections dating back to the early part of the twentieth century.  The mechanical lever voting machine was already in widespread use across the United States in 1930 (I remember voting on them in the 1970s in Pennsylvania).  By the late 1990s, computer technology began to be integrated into voting machines as a result of the personal computer revolution that made small computers relatively inexpensive and widely available.  Already in the early 2000s disgruntled losing candidates began to blame their losses on faulty machines (both major parties did this). 
    Much evidence of corrupted election machinery has been uncovered since the 2020 US Presidential election.  The My Pillow entrepreneur, Mike Lindell, has spent all of his spare time over the last two years hiring experts to investigate particular vendor products, holding televised symposia on the fraud exposed by those experts, and speaking out at every possible venue about this issue.  Many voters have become convinced, as a result, that the only way to end corrupt voting machine fraud is to eliminate the machines altogether in future elections.

    A precinct is an electoral district of a city or a town served by a single polling station.  When I was growing up in Western NY in the 1950s and 1960s, my Dad was a Republican precinct committeeman who supervised the counting of votes on the Republican side (his counterpart was the Democrat committee person who did the same task on behalf of the Democrat party for that polling station).  The advantage of this system was that each precinct had a well known quantity of voters that could all be verified and counted within a few hours after the polls closed (the largest precincts in 1960 New York contained 1000 registered voters).  With the advent of computerized and networked voting machines, the tabulating process began to stray away from the precinct level into larger and larger voting centers.  This obscured the chain of custody supervision performed by local committee persons, replacing it with opaque, black box transfers that are much harder to audit, and opened up another opportunity for fraud by bad actors.  Today, in Maricopa County, Arizona, registered voters total in the hundreds of thousands, and the centralized counting process has already stretched four days past the close of polls, with officials estimating that a final vote tally might not be reached until 7 to 10 days after the election.

    The amount of money that politicians of all stripes are able to raise today has reached unimaginable sums.  Before the 2010 Supreme Court Citizens United decision, which removed all limits on how much corporate and union money could be contributed (in the name of free speech), spending for entire election cycles for all state and federal candidates topped out at a few hundred million dollars.  It is now not uncommon to see tens of millions spent on an individual race for a national office like Congressperson or Senator.  Presidential races have now reached the billion dollar figure.  Despite the minor restriction that corporate and union money cannot be contributed directly to individual candidates, that money always finds its way to individual races after passing through the hands of political party bosses, who entrench their power by the dollar support they can offer their favored candidates.  It all goes to wall-to-wall advertising campaigns on TV, radio, and social media, where anyone can claim anything, and each side has its own "fact checkers" equally ready to lie to the advantage of their particular side.  Which means it is ultimately controlled by a very few elite who invariably act to maintain their power.

This excerpt from Matthew Henry's Commentary on 1 Kings 16:15-28 shows such bad behavior by the elites has been going on throughout human history:

                                                   
    When men forsake God, they will be left to plague one another. Proud aspiring men ruin one another... Many wicked men have been men of might and renown; have built cities, and their names are found in history; but they have no name in the book of life.
                                               


    So as in virtually everything in our fallen world, integrity falls by the wayside when it becomes a choice between doing what is right and what most benefits those who have clawed their way to the top by a lifetime of making the same selfish choice at every opportunity.  We know from a myriad of Biblical stories that this has been a pattern throughout human history.  The only difference is today, we have enhanced our ability to corrupt by the leverage provided by our technical advances in every field.  For me, it is overwhelmingly evident that mankind's sinful nature can never be overcome by human acts.  Thankfully, by the Grace of God, Lord Jesus opened the door for us by paying in full for our sins if we only accept him into our hearts and strive to change our lives to be more like him every day we walk on this earth. 

    One day, the Biblical prophecy of Revelation will be fulfilled, but in the meantime, we must strive to stay on that walk and do everything we can individually do to spread the Gospel to everyone with ears to hear, to serve others with the talents and skills we have been given by God, and to raise up the next generations to find the same path for themselves.