Election Tampering: Y2K Fears Redux?

For the last few days, I’ve been reading reports on the Trickbot takedown. U.S. Cyber Command and Microsoft have been hitting Trickbot, a large botnet controlled from Eastern Europe, most likely Russia, that appears to have been maneuvering to interfere with the November 3rd U.S. election. The takedown steps were apparently timed to give the botmasters little time to rebuild before the election. I sincerely hope the strategy succeeds. And I hope, and believe, that the Trickbot takedown is only the tip of an iceberg in a battle that is freezing out cyberattacks on our election.

We survived Y2K fears.

Trickbot

Trickbot is a multi-purpose covert criminal supercomputer cobbled together from thousands of hacked Windows computers. The botnet’s design gives hacking victims few clues that their devices are secret participants in criminal cyberattacks. The Trickbot crimes are mostly ransomware exploits for illegal profit. For some background on botnets, see Home Network Setup: Smart Kitchen Crisis.

Y2K fears

The reports reminded me of Y2K fears, the year 2000 computer scare of 20 years ago. I hope today’s efforts are as successful as the Y2K remediation, which succeeded so thoroughly that Y2K was called a hoax by those who did not understand the computer industry.

Y2K and Bolivian basket weavers

I remember the Y2K affair well. It was no hoax. Everyone in the computing industry knew trouble was coming. Already in the early 1980s, the issue was a hot topic among engineers. The problem went back to the early days of computing, when data storage was expensive. It’s hard to believe today, but in those days the fastest and most reliable computer memory was hand-crafted by weavers from copper wire and tiny donut-shaped ferrite magnets called cores. My meatware memory is not entirely reliable, but I remember hearing that Bolivian basket weavers were recruited to manufacture woven core memory. The computers on the Apollo moon missions were based on hard-wired ferrite cores.

Today, we talk about terabytes (trillions of bytes), but in those days even a single K (1,024 bytes) of memory cost thousands of dollars. I guess everyone today knows that a byte is eight bits, a sequence of eight 0s and 1s. Each donut magnet in core memory represented a single 0 or 1. The costing rule of thumb for handcrafted core memory was $1 a bit. At that price, a terabyte of memory would cost 8 trillion dollars, roughly the market price of 8 Amazons.
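The arithmetic is worth spelling out; here is a quick back-of-the-envelope in C++, using the decimal terabyte the prose implies:

```cpp
#include <cstdio>

int main() {
    const double dollars_per_bit = 1.0;  // rule of thumb for hand-woven core
    const double bits_per_byte   = 8.0;
    const double terabyte_bytes  = 1e12; // decimal terabyte

    // 10^12 bytes x 8 bits/byte x $1/bit = $8 trillion
    double cost = terabyte_bytes * bits_per_byte * dollars_per_bit;
    printf("1 TB of core at $1/bit: $%.0f\n", cost); // 8000000000000
    return 0;
}
```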

In the 1960s and 70s, programmers did not waste storage. They saved a few bytes by storing the year as two digits. 1913 was “13” and 1967 was “67.”

Most business programs used “binary coded decimal,” which stores each decimal digit in four bits. Storing the year as 2 digits instead of 4 saved 8 bits: at a buck a bit, about eight dollars, more than $50 in 2020 dollars. Put another way, today, the price of the two bytes a four-digit BCD year occupied in 1970 will buy 2 terabytes of flash memory, delivered to your door by Amazon. That flash memory would have cost 16 trillion dollars in 1970, filled several warehouses, and probably generated enough heat to warm New England for the winter.
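To make that concrete, here is a minimal sketch of packed BCD in C++, an illustration of the idea rather than any particular mainframe’s record layout:

```cpp
#include <cstdint>
#include <cstdio>

// Pack a two-digit year into one byte of binary-coded decimal:
// each decimal digit occupies four bits, so "67" becomes 0x67.
uint8_t pack_bcd_year(int year) {
    int yy = year % 100; // keep only the last two digits
    return static_cast<uint8_t>(((yy / 10) << 4) | (yy % 10));
}

int unpack_bcd_year(uint8_t bcd) {
    return (bcd >> 4) * 10 + (bcd & 0x0F);
}

int main() {
    printf("1967 -> 0x%02X\n", pack_bcd_year(1967)); // 0x67, one byte
    printf("1913 -> 0x%02X\n", pack_bcd_year(1913)); // 0x13
    // The seed of Y2K: 2000 packs to 0x00, which unpacks to year "0",
    // so a naive comparison concludes that 2000 is earlier than 1967.
    printf("2000 -> year %d\n", unpack_bcd_year(pack_bcd_year(2000)));
    return 0;
}
```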

Dates and the year 2000

Dates are used heavily in business computing, somewhat less in scientific computing. Accounting, scheduling, supply chain management all depend on date calculations. Today most of these calculations are handled in a few standard code libraries that are used over and over, but those libraries did not exist when the programs that ran the world’s business in 1999 were written. Each time a program needed a date calculation, a programmer wrote code.

Man, did they write code. Programmers delight in rolling their own. Coming up with an entirely original version of a mundane date calculation will make a skilled coder’s heart sing. And there is joy in writing code that looks like it does one thing and does something quite different. These tastes ensured that, given the opportunity, coders would come up with many obscure and abstruse ways to calculate and analyze dates.

When I was hiring coders, I challenged candidates to describe an algorithm to determine the late fee on an invoice payment if 1% was to be added for each calendar month that had passed after the date the invoice was cut. If you think that description’s a little vague, you’re right. A hidden part of the challenge was for the coder to determine exactly what I had asked for. I don’t believe I ever got two identical solutions, and some would have behaved wildly if the calculations had crossed the Y2K boundary.
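Here is one of the many possible readings, sketched in C++ with two-digit years stored the way coders of that era stored them. The function and its behavior at the boundary are my reconstruction for illustration, not any candidate’s actual answer:

```cpp
#include <cstdio>

// One reading of the challenge: charge 1% for each calendar-month
// boundary crossed between the invoice date and the payment date.
// Years are two-digit, as they were in the code of that era.
int late_fee_percent(int invoice_yy, int invoice_month,
                     int payment_yy, int payment_month) {
    int months = (payment_yy - invoice_yy) * 12
               + (payment_month - invoice_month);
    return months > 0 ? months : 0; // no fee if paid in the same month
}

int main() {
    // Invoice cut in November 1999 ("99"), paid in February 2000 ("00"):
    // three calendar months have passed, so the fee should be 3%.
    int fee = late_fee_percent(99, 11, 0, 2);
    // But (0 - 99) * 12 + (2 - 11) = -1197, so the fee clamps to zero
    // and the late payer escapes; an unclamped version would have
    // "charged" -1197%, a credit nearly twelve times the invoice.
    printf("fee across the Y2K boundary: %d%%\n", fee);
    return 0;
}
```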

The industry took Y2K seriously

For good reason, in the late nineties, the industry got serious about the approaching crisis.

Y2K was a big deal. By 1995, I, along with many of my colleagues, had intentionally forgotten how to code in COBOL, the mainframe programming language of most 20th century business programs that were at the heart of Y2K issues. I won’t say that COBOL is a bad language, but few programmers cared for its wordy style. The scarcity of COBOL programmers elevated the language to a money skill in the last days of the century.

Y2K in my products

At that time, I was managing the development of a service desk product that I had designed several years before. I was coding almost exclusively in C++ on Unix and Windows, platforms whose internal representation of dates and times, seconds counted from a fixed epoch rather than digits of a year, would not, in theory, have Y2K problems.
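The theory, and the trap, in one short sketch: the epoch counter itself sails through the year 2000 untouched, but the C library’s struct tm reports tm_year as years since 1900, and code written with the two-digit mindset glued a literal “19” onto it:

```cpp
#include <ctime>
#include <cstdio>

int main() {
    // Unix and Windows C/C++ runtimes count time as seconds from a
    // fixed epoch (time_t); nothing in that counter rolls over in 2000.
    time_t now = time(nullptr);
    struct tm *t = localtime(&now);

    // But struct tm stores tm_year as years SINCE 1900. The two-digit
    // mindset prints a literal "19" prefix:
    printf("19%d\n", t->tm_year);      // prints "19100" in the year 2000
    printf("%d\n", 1900 + t->tm_year); // the correct rendering
    return 0;
}
```

That “19100” display bug turned up on real websites as the date rolled over; it was the two-digit mindset wearing modern clothes.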

Nevertheless, management, who didn’t know that our product was theoretically immune to Y2K, declared my Fiscal Year 2000 bonus would depend on the absence of Y2K errors in our code. I smiled when I read the letter that described my bonus terms. Most years, fickle market conditions that I could not influence decided my bonus. This time, my bonus was in the bag due to a wise previous decision to build on a platform that sidestepped the central Y2K issue.

Still, I don’t mess around with my bonus. I started feeding Y2K use cases into our test plans, just to be sure.

I’m glad I did. Management, bless their bottom-line-focused hearts, were right to worry about Y2K. It’s been a long time, but I estimate my team spotted and fixed a dozen significant Y2K defects that could have brought the product down or caused crippling errors.

Our defects were not COBOL-style issues, but they stemmed from the two-digit-year mindset that then pervaded business coding. For example, serial numbers often embed a two-digit year somewhere. A clever developer might lean on that embedded year, say to sort or compare records, and unknowingly create a Y2K defect.
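A minimal sketch of how that goes wrong; the serial format and the comparison are hypothetical, not our product’s actual scheme:

```cpp
#include <cstdio>
#include <cstring>

// Hypothetical serial format "YY-NNNN": ticket 0042 opened in 1999
// is "99-0042". Sorting serials as strings orders them by year,
// until the year 2000 arrives.
int compare_serials(const char *a, const char *b) {
    return strcmp(a, b); // lexicographic order == chronological, in theory
}

int main() {
    const char *older = "99-0042"; // December 1999
    const char *newer = "00-0001"; // January 2000
    // Positive result: the 1999 ticket sorts AFTER the 2000 ticket,
    // so "most recent" queries silently return the wrong records.
    printf("%s vs %s: %d\n", older, newer, compare_serials(older, newer));
    return 0;
}
```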

If those dozen defects had kicked in simultaneously on New Year’s Eve, service desks would have begun failing mid-Pacific in a pattern that would repeat itself as New Year’s Day followed the sun around the globe to Hawaii the next day. That was in a product running on platforms that were supposedly immune to Y2K errors. The devil in it was that a service desk would be used to manage the response to incidents that arose from other Y2K system crashes, compounding the chaos.

A real threat

People who are not responsible for building and maintaining computer systems probably don’t realize what happens when a raft of defects appear at the same time. Troubleshooting the origin of a malfunction caused by a single mistake can be difficult. With two mistakes, the problem becomes more complex and confusing. Each added source compounds the difficulty. At a certain point, the system seems to behave randomly, and you want to delete the whole thing and start over.

In the bad old days, we often saw large software projects break in dozens of places at once when we fired them up for the first time. We used to call it “big bang” testing. Since writing code is more fun than testing code, many developers embraced the methodology and put off testing. But untested code is buggy code. Those big bangs were intractable messes of defects that could take weeks and months to untangle. We soon learned to test small units of code early and often, before mild-mannered projects became monsters.

As testing proceeded in development labs, engineers began to recognize that Y2K threatened to be a mass conflagration like those big bang debacles. Worse. Y2K bugs were everywhere. Their corruption extended to hundreds of systems. Unremediated Y2K threatened software mayhem like I have never seen and hope never to see.

The reaction

Some engineers seized the limelight by overreacting. Doomsday prophets got wind of what was happening in development labs all over the globe. They went to the media with predictions of the imminent collapse of financial systems, communications, and power grids, a collapse that threatened to halt the economy and provoke the mother of all economic disasters. ATMs and traffic lights were about to go out of control. The preppers stockpiled guns, freeze-dried chili, and enough toilet paper to isolate for months.

Y2K at CA Technologies

I checked in to the CA Technologies (then Computer Associates) development lab in Kirkland, on the east side of Seattle, early in the morning of December 31, 1999. As the senior technology manager in Seattle, I had orders to gather a team of developers and support people to man an emergency response hotline. The idea was that any Computer Associates customer with any Y2K problem could call the hotline and get expert help. Most engineers thought this was a transparent marketing ploy to take advantage of fears that had been whipped to a frenzy by the doomsday crowd.

Highly publicized orders were issued that development teams were on the hook until the last Y2K customer issue was resolved. The company supplied special Y2K tee-shirts. A buffet of sandwiches and periodic deliveries of pizza and other snacks were set up in a large conference room we called the board room. A closed-circuit television feed from headquarters in New York was beamed onto a then-rare wall-sized flat video screen. Rumor said champagne was scheduled to arrive at midnight.

The big letdown

I honestly can’t remember if the bubbly ever appeared. Late in the afternoon, I told all my crew they could leave if they wanted. I had to stay until midnight, but there was no reason to spoil their New Year’s celebration.

Why? On the Pacific Coast, we were among the last time zones to flip to 2000. In the morning, a customer called headquarters in New York with a minor problem in Australia. It was fixed in minutes, mostly a user misunderstanding as I recall. The Kirkland team was not called on once. Typical developers, the crew loaded up on free sandwiches and pizza, took the loot to their cubicles, and silently worked on code. A few salespeople wandered in to check on the action; there was none.

After all the buildup, why the big meh? Because humans aren’t stupid. The industry responded to the danger with testing and fixing. I’ve seen estimates, which I believe, that upwards of $200 billion was spent on Y2K remediation in the 1990s. That was money well spent. Consequently, Y2K came and went with barely a ripple.

The Y2K hoax

I take it back about humans not being stupid. We immediately began to hear about the Y2K hoax, a conspiracy and scare tactic for whatever purpose the speaker or writer found convenient. I’m sure the loudest criers of hoax were the same loudmouths who screamed computer Armageddon. I’d like to roll back the calendar and give the world a taste of what would have happened if Y2K had been ignored.

Actually, I wish Y2K had been ignored outside the industry, and the people who understood the problem had been left to quietly fix it without all the noise.

But that wouldn’t be right either. We were not heroes. The cost of the Y2K remediation was the price of poor judgement. Acting as if it did not occur would only encourage future bad choices. The remediation was nothing to be proud of. The industry should be called to account.

Nonetheless, in my dark moments, I have no patience for people who broadcast opinions but don’t carry water, put out fires, and make things work. Not everyone was fortunate enough to be in on the action of Y2K, and not everyone has the training and experience to know what it takes to keep our computer-dependent society viable.

Y2K and the general election

Which brings us back to the Trickbot takedown. November 3rd 2020 has begun to smell to me like the approach of January 1st 2000. I see real danger and a dead serious response. I’m not an active member of the cybersecurity community, but I keep up. I have no doubt that criminals, extremists from every corner of the political spectrum, and foreign nation-states are planning cyber attacks to extort payments from election agencies, stop people from voting, slow vote tallies, and question results.

Election tampering hoax

But I also see the seriousness and competence of the efforts to prevent and neutralize these bad actors. Some signs point to success. Already, two weeks before the election, 29 million voters have voted, almost five times the number that had voted at this point in 2016. I’m sure election day will be tough, but I will not be the least surprised to hear another big meh on November 4, followed by cries of “election interference hoax!” from every direction. From my vantage now, it’s clear it is no hoax.

But I hope it looks like a hoax.

3 Replies to “Election Tampering: Y2K Fears Redux?”

  1. Marv
    Nice piece of thoughtful analysis and comparison. I recall the efforts to address Y2K issues at the Library of Congress. A few of the systems staff and collections management specialists spent the night of 31 December 1999 sleeping in the Great Hall just to be ready to respond to any emergency. Fortunately they passed the night without incident. Hopefully we will weather 3 November 2020 as well.
