This Year's Y2K20 Bug Came Directly From 'A Lazy Fix' to the Y2K Bug (newscientist.com) 160

Slashdot reader The8re still remembers the Y2K bug. Now he shares a New Scientist article explaining how it led directly to this year's Y2020 bug -- which affected more than just parking meters: WWE 2K20, a professional wrestling video game, also stopped working at midnight on 1 January 2020. Within 24 hours the game's developer, 2K, issued a downloadable fix. Another piece of software, Splunk, which ironically looks for errors in computer systems, was found to be vulnerable to the Y2020 bug in November. The company rolled out a fix the same week to its users -- who include 92 of the Fortune 100, the top 100 companies in the US....

The Y2020 bug, which has taken many payment and computer systems offline, is a long-lingering side effect of attempts to fix the Y2K, or millennium, bug. Both stem from the way computers store dates. Many older systems express years using two digits -- 98, for instance, for 1998 -- in an effort to save memory. The Y2K bug was a fear that computers would treat 00 as 1900, rather than 2000. Programmers wanting to avoid the Y2K bug had two broad options: entirely rewrite their code, or adopt a quick fix called "windowing", which would treat all dates from 00 to 20 as belonging to the 2000s rather than the 1900s. An estimated 80 percent of computers fixed in 1999 used the quicker, cheaper option. "Windowing, even during Y2K, was the worst of all possible solutions because it kicked the problem down the road," says Dylan Mulvin at the London School of Economics....
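The windowing trick the article describes can be sketched in a few lines. This is an illustrative reconstruction, not anyone's actual code (the function name and the pivot value of 20 are made-up examples; real fixes were mostly in COBOL and picked various cutoffs):

```python
def expand_two_digit_year(yy, pivot=20):
    """Windowed Y2K fix: two-digit years below the pivot are read as
    20xx, everything else as 19xx. A pivot of 20 is exactly the kind
    of choice that expired on 1 January 2020."""
    return 2000 + yy if yy < pivot else 1900 + yy

assert expand_two_digit_year(5) == 2005    # "05" -> 2005
assert expand_two_digit_year(98) == 1998   # "98" -> 1998
assert expand_two_digit_year(20) == 1920   # "20" -> 1920: the Y2020 bug
```

The fix is cheap because only the interpretation of stored years changes, not the stored data; the cost is that the window's upper edge eventually arrives.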

Another date storage problem also faces us in the year 2038. The issue again stems from Unix's epoch time: the date is stored as a 32-bit integer, which will run out of capacity at 3:14 am on 19 January 2038.
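That limit is easy to verify: the largest value a signed 32-bit counter can hold is 2**31 - 1 seconds past the Unix epoch of 1 January 1970. A quick sketch using Python's standard datetime module:

```python
from datetime import datetime, timezone

# Largest second count a signed 32-bit time_t can represent.
overflow = datetime.fromtimestamp(2**31 - 1, tz=timezone.utc)
print(overflow)  # 2038-01-19 03:14:07+00:00
```

So the "3:14 am" is 03:14:07 UTC; clocks elsewhere will fail at whatever that instant is in their local time zone.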

This discussion has been archived. No new comments can be posted.

  • Lazy Vs. Disaster (Score:5, Insightful)

    by SuperKendall ( 25149 ) on Sunday January 12, 2020 @01:43AM (#59611364)

    I actually have a lot of sympathy for people that just implemented the windowing solution at the time, rather than trying to change a bunch of probably very fragile systems in a short timeframe.

    However, then those companies had 20 years to shift off windowing!! That is a hell of a long time to add two more digits of storage per year kept and adjust whatever systems broke from that change. So anything that breaks now, I just can't have much sympathy for.

    And side-note, was Splunk even around in the year 2000? It seems really funny to me that they would have this issue in the software they have built.

    • Re:Lazy Vs. Disaster (Score:4, Interesting)

      by asackett ( 161377 ) on Sunday January 12, 2020 @03:00AM (#59611450) Homepage

      It's a big pain in the ass to futz the fixed-width flat-file databases to shift every damn thing past every date two digits rightward in COBOL. Somewhere, somehow, some monkey wrote his own parser. It's Never Easy.

      • Re:Lazy Vs. Disaster (Score:4, Interesting)

        by LordWabbit2 ( 2440804 ) on Sunday January 12, 2020 @03:39AM (#59611506)
        Not just that, but then it's also a data migration as well as a code change. There is also a size limit per row, and if the file is already smack on the limit, where are you going to get the two (or more) bytes from? People who only work with shiny new programming languages and have LOTS of dynamic space to play with don't realise the problems that adding two bytes can cause in legacy code. And no, the solution isn't to just rewrite everything in shiny xyz, because it's a LOT of code, and a full system rewrite is not in the budget. So you get kludges, like windowing. Kick the can down the road. Hope like hell that by then COBOL will finally be dead, and make sure you no longer work there when you get to the can.
        • by jythie ( 914043 )
          Not to mention caching behavior. Your row fits nicely into 4k, everything behaves as the carefully constructed system intends. At 4k plus a few extra bits, caching becomes nonsense, since nothing fits within the allocator's design anymore.
          • If you created a 4k allocation and didn't leave a couple bytes for an API change, then you are a shitty programmer. Please let someone qualified fix this; don't give it back to the guys who fucked it up in the first place as they are fucking idiots.

            • by jythie ( 914043 )
              when every cycle and every byte matters, leaving extra 'just in case' does not always make sense, esp when 'api changes' are not on the table in the first place. The particular project I was working on, the API was not going to change, period end stop. The use case did not allow for it, and a lot of stuff designed to last decades is like that.
      • by jythie ( 914043 )
        Years ago I worked on a FORTRAN system that suffered from this problem. I doubt they ever fixed it; I shudder to think what a nightmare it would have been to even try, since it would not only have been a multi-language code change (data exchange between FORTRAN and C components using twice-defined structures) but decades of data would have had to be reprocessed from the old format into the new one. Just... an utter nightmare.
    • The reason the window only extended to 2019 is simple -- values beginning with '20' were assumed to be 4-digit years followed by 2 more digits, while values beginning with 21..99 were 19xx and 00..19 were 20xx.

      It wouldn't surprise me if lots of software actually started having problems in 2019, by making the same assumption that a year string beginning with '19' is a 4-digit year as well.

      As shocking as y2k-complacency seems now, even circa 1997, it was common for Perl programmers to display years by prepend

    • by jythie ( 914043 )
      I suspect what happened in many cases is that companies discovered the interlocking nature of their systems meant that incremental updates just were not possible, so it became 'window or replace everything', and they went with windowing under the assumption that they would be given time and resources for a real fix once the pressure was off... but then the money never materialized.
    • Perhaps you should dig a bit deeper into the problem.

      There is no real fix.

      That is why we had about four solutions, with their unique problems.

      E.g. how to show it on the screen? Hidden dates in "other numbers", like a contract number.

      E.g. 100-05-255, department 100, 5th year after founding of the company, day number 255. Or other silly things. migration of old data on tapes etc. Having an old terminal, someone writes a string ending in 98 at certain position, another one knows there is a date and scraps if f

    • What we learned from Y2K was that proactively fixing problems is bad. You frequently hear comments to the effect of "Y2K was not a problem - why was so much money spent on it?". The obvious answer is that Y2K was not a problem because so much money was spent on fixing it but the bean counters are now understandably reluctant to incur cost for what they see as future non-problems. For the record, the same applies to politicians - never fix a problem before it's a crisis and you can take credit for it.
  • by Gravis Zero ( 934156 ) on Sunday January 12, 2020 @02:08AM (#59611386)

    The windowing solution wasn't a bad solution per se as it was expected that the computers/software would be replaced in the next 20 years. The fact that it wasn't and that it was never fixed demonstrates pervasive mismanagement of assets throughout the business world. They had 20 years to correct these issues but chose to never address them.

    You reap what you sow.

    • As I recall, this was all more or less coincident with the dot com boom. All those coders probably quit their boring salaried jobs to instead get rich, just as soon as their in-lieu-of-salary shares of Kozmo.com vested.

      • All those coders probably quit their boring salaried jobs to instead get rich { ... }

        That was coincident with but not part of Y2k. On the technical front, Y2k was straightforward enough but tedious as hell, so the sorts of people who could groove on that went for it. If'n ya knows programmers, the rest is easy to figure out.

        • We had a specialized compiler/data flow analysis and code visualization and automatic transform system ...
          Nothing tedious about that ... it was actually fun to fix a million lines of code in 4 weeks with 4 people working in a kind of pipeline system. I think top performance was about a million lines of code per week, but the stupid industry in Germany did not believe us, and rather hired consultants that cost twice as much as we did and took 50 times as long to fix it. (We did in weeks what others did in

      • by jythie ( 914043 )
        I've seen some models suggest it might have even added fuel to the boom. All of a sudden big companies were pouring money (and debt) into fixing a problem, which meant rising salaries and lots of new people entering the field, and that capital/skill turned around and fueled other things. But after y2k, those faucets turned off.
    • They had 20 years to correct these issues but chose to never address them.

      The same thing will be said in a few years about climate issues. This is how it works. People rely on what they can enjoy at this very moment.

    • The windowing solution wasn't a bad solution per se as it was expected that the computers/software would be replaced in the next 20 years

      If the software had to face y2k, it was already old stuff in 2000. If the problem still exists based on the y2k fix, that means this is [a] really old [p.o.s]. Bad software that shouldn't be used nowadays.

      • It can't be that bad if it's still in use after more than 20 years. Surely it's still solving a problem, most of the issues and quirks are understood, and more importantly: who's going to pay for a complete re-write and years more of discovering newly created issues and the maintenance costs of fixing them? Want to bet that a nice shiny rewrite with today's faddy tools won't even be runnable in 20 years?

        • by jythie ( 914043 )
          Well, partly this showed why it was bad. Even if it worked, the code was brittle and difficult to adjust when a problem was discovered. Y2K was a high-profile example, but countless others exist too. Scaling issues tend to be a major one, for instance: old code that is asked to do more and more over time and gets glitchy when the volume is too high... and is, again, difficult to go in and edit in order to address problems.
      • Comment removed based on user account deletion
      • The biggest software we fixed was a very simple thing: a payroll system that prepared bank transactions and calculated the taxes and other fees on them. A company ran it as a contractor for dozens of other companies. That was the only thing they did, or could do. WTF would anyone throw that software away and write it anew for dozens of millions of dollars when I and my team can "fix" it with "sliding windows" forever, for 1 million?

    • by imidan ( 559239 )

      The windowing solution wasn't a bad solution per se as it was expected that the computers/software would be replaced in the next 20 years.

      You might as well make the same excuse for the original Y2K problem. Using 2 digits wasn't a bad solution per se as it was expected that the computers/software would be replaced in the next [n] years.

      Of course, if you "fixed" Y2K by windowing only to have your solution blow up in your face 20 years later, it demonstrates that you didn't learn your lesson even given two opportunities.

      • by Kjella ( 173770 )

        You might as well make the same excuse for the original Y2K problem. Using 2 digits wasn't a bad solution per se as it was expected that the computers/software would be replaced in the next [n] years.

        If computers had stayed the way they were in the 1970s, it's entirely possible they'd have stayed on two-digit years forever and manually managed the switch between 00-99 and 50-49; for most businesses a 50-year operating window is ample, and those bits were precious. To use a car analogy: in the year 2000 the same consideration would be like "Hey, I'm charging my Tesla off this grid, can I add an LED light bulb too or would that put too much of a strain on the system?". Sometimes having the necessary resources is w

        • by imidan ( 559239 )

          Oh, yes, and it was perfectly reasonable to use only two digits back then. And, fortunately, enough people were aware of Y2K when it was coming up that they were able to raise awareness enough for something to be done about it.

          Today, ISO 8601 dates have the flexibility to add additional year digits, so 9999 isn't even the limit. But even if humanity lasted until 9999, the way we number months and days surely will have changed, making 8601 obsolete before that extremely forward-thinking feature is ever used.

          • Comment removed based on user account deletion
            • We already use multiple calendars around the world. It's almost inconceivable to think that over the next 8000 years, no one will add support for the Hebrew and Chinese calendars to the majority of major software. Once you have support for multiple calendars, it's an easy step to change the calendar to something more sensible than an imprecise calendar centered around a Christian myth.

              Hell, think what happens if we get people to Mars. It would take a vast amount of willful ignorance to try to port the Earth

              • change the calendar to something more sensible than an imprecise calendar centered around a Christian myth

                The modern calendar (the Gregorian calendar) isn't centered on Christian mythos at all, it's just a refined version of the Julian calendar. And it's based on the orbit and revolution of the Earth. Leap seconds are a result of slight deviations in the orbital period, and leap days are a result of the rotational period not precisely aligning with the orbital period.

          • the way we number months and days surely will have changed
            Extremely unlikely. Except for a few exotic ones, e.g. the Mayas, we have numbered them that way for 6,000-8,000 years; why would we change anything in the near -oops- future?

        • by Sique ( 173459 )
          And I am sure that in 7979 years, pundits will write long editorials and smug nerds will comment that all businesses had ample time to prepare for Y10K. And still, there will be a large effort to fix Y10K before New Year's Eve.
        • The idea that people used 2 bytes, aka 2 digits to save space, is an internet myth.

          People, aka programmers, used two digits because every paper they dealt with -- checks they wrote or received, and most other things except a birth certificate -- had only two digits. For most programmers it was simply natural to use two digits, still in the late 1980s... because the idea "oops, that will blow up in our faces later" simply did not occur to anyone, except the "alarmists".

          That is how the human mind ticks, just doing t

      • Comment removed based on user account deletion
        • by jythie ( 914043 )
          Making it even worse, the Y2K period saw a massive influx of very well paid but inexperienced people, who probably did not stay in the field once the $80-100/h paychecks went away, or were outright fired when the projects closed.

          A lot of places did not put their best and brightest on the problem, or their long term people.. they hired a bunch of contractors to work on it and then fired them all.
    • Comment removed based on user account deletion
  • All right, how many thought "let's slide the window, 20 to 40" ?!
    • Well, they could have implemented a sliding window: in 2001 => 2021, etc...
    • by Njovich ( 553857 )

      Realistically that is the best solution for many systems if a full rewrite isn't an option.

    • by swilver ( 617741 )

      Not me, I'd go for 60

    • Comment removed based on user account deletion
    • Basically everyone who did _professional_ Y2K fixing thought of it; it was the standard method.
      Or do you mean "a fixed window"? Most used sliding windows: every year the lower bound and the upper bound moved one year forward.

    • All right, how many thought "let's slide the window, 20 to 40" ?!

      I'm actually thinking about the conversation they had:

      Flunky: "What's the earliest possible legitimate date we'll ever encounter? 1960? 1950? 1940?"

      Tech lead/PHB: "Oh, damned if I know, it can't possibly be before 1920. Use that, check the box, and move to the next project."

    • All right, how many thought "let's slide the window, 20 to 40" ?!

      I thought of that and immediately rejected it. As the historical record being handled expands, you start running into significant past times that are no longer representable, "eaten" by the advancing window.

      (Note that it's not safe to advance the window to about the start of computing plus 100 years minus a couple for flags: Some databases were converted from pre-computer written records with pre-computer dates.)

      People are living longer now,

  • bad generalization (Score:5, Insightful)

    by bloodhawk ( 813939 ) on Sunday January 12, 2020 @02:57AM (#59611448)
    While windowing was a bad fix for a few systems, it was a great fix for a shit ton of systems. Many systems that had this fix or similar have now been decommissioned, millions saved in unnecessary work.
    • While windowing was a bad fix for a few systems, it was a great fix for a shit ton of systems

      Sure. But that also means the software had to be replaced / deeply changed in the years after Y2K. So: a quick fix to sleep well between 99 and 00, then the usual "it works, so we forget about it". The ugly "don't fix what's not broken" we still hear, usually from people not so techy, who might nevertheless have decision-making power, unfortunately.

      • Much of the software/systems was long past its use-by date; it just needed to be able to hang on for a few more years.
      • Yes, the infamous mantra of the techno-conservative, "If it's not broken, don't fix it". While that's a good policy in some instances, it ignores the effects of slow change, which might range from the bit rot in an optical disk to web servers that work fine for hundreds of users but break down when hit by tens of thousands of users (the formerly so-called Slashdot effect). In such cases, it's often better to anticipate the change, rather than rely on the highs and lows of past statistics. Is it that hard t
      • by Calydor ( 739835 )

        It also-also means the changed software was changed without a literal deadline hanging over the programmers' heads, which ideally should mean fewer bugs.

      • by BeerCat ( 685972 )

        While windowing was a bad fix for a few systems, it was a great fix for a shit ton of systems

        Sure. But that also means the software had to be replaced / deeply changed in the years after y2k. .

        For some systems. Where the company no longer existed / was bought by another (and everything moved onto the other company's system) / old system was replaced, then the "bad fix" was just enough.

        What is left is the "old system not replaced" (which might include a company bought by another that did not integrate into the current system -- or, even worse, a buying company that had a 2020 issue themselves), which is a significantly smaller subset of those needing to be fixed for Y2K.

      • by jythie ( 914043 )
        Also do not forget that the y2k period was flush with cash, while not long after there was a crash which drastically changed budgets. They might have gone in with a perfectly reasonable plan based off current income and resources, then had it smashed.
    • Exactly this, windowing worked just fine for a LOT of systems that had to be fixed rapidly. A lot of those have been fixed properly and a lot more of them have been retired completely. What we see here is the few that did the windowing fix and walked away. Probably done by contractors who walked away with their paycheck and left no one the wiser to the time bomb left behind.
  • The first example is the worst one. A game called 2020, marketed for 2019 and 2020, does not work in 2020. This one is not old software that was patched for the year 2000 bug.
    • You missed the best part: the game has 2020 in the title, demonstrates the problem with the window solution to the Y2K bug, and is published by a company called 2K.

  • Really? (Score:4, Interesting)

    by ledow ( 319597 ) on Sunday January 12, 2020 @08:16AM (#59611908) Homepage

    Honestly, just stop pissing about.

    64-bit minimum for everything. We aren't limited by RAM etc. nowadays, you can use the largest primitives available and that means 64-bit on anything in the last 10+ years, and "emulating" those on older platforms is incredibly trivial and we've been doing that for years too.

    64-bit gives you until the year 292,277,026,596. Not just "let's knock this down the road a bit", but "let's solve this for the rest of the existence of the human race". For the cost of another 32-bit number in a tiny handful of places. That's 4 bytes. On machines with GIGABYTES, literally billions of bytes of RAM, storage, etc.
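The 292-billion-year figure checks out as rough arithmetic: a signed 64-bit count of seconds covers 2**63 seconds forward from 1970. A back-of-the-envelope check (assuming the average Gregorian year length):

```python
SECONDS_PER_YEAR = 365.2425 * 24 * 3600   # average Gregorian year

# Years representable going forward with a signed 64-bit seconds counter.
years = 2**63 / SECONDS_PER_YEAR
print(f"{years:.4e}")   # ~2.9228e+11, i.e. roughly the year 292 billion
```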

    We have no need to assembly-optimise a tiny workaround to ridiculous extremes any more.

    I also contend that everything should be 64-bit minimum: RAM size counters, file size limits, basic integers holding the number of USB devices, whatever, who cares? Just make it 64-bit. There are 128-bit-supporting filesystems out there already.

    Rather than waste our time, time and time again, just make everything 64-bit minimum and solve the problem for the rest of humanity's existence.

    • by Cederic ( 9623 )

      Why would I waste so much bandwidth, RAM and storage space by mandating that everything is 64 bits?

      We didn't spend decades on signal processing, efficient network protocols and storage compression algorithms to just throw it all away now.

      Sure, sometimes you have a machine with gigabytes of RAM. Sometimes you have a machine with kilobytes, sometimes you have 14 hours of latency on the network, sometimes you want to save exabytes of storage overheads.

      Those are real world problems in 2020. Mandating a blanket

    • Re:Really? (Score:5, Insightful)

      by jythie ( 914043 ) on Sunday January 12, 2020 @10:13AM (#59612074)
      That works fine until you have lots of data, or are fitting things within a caching system. The difference between 'this block of records fits perfectly within this 4k page' and 'this block of records does not' can mean the difference between humming along swimmingly and grinding to a halt.

      People also tend to forget just how wasteful 64bit tends to be when dealing with data that is much smaller than that. Doesn't impact desktops, cell phones, or web servers much, but when you start getting into number crunchers it can really add up. I work on software that is still 32 bit because we discovered that even with modern hardware moving over to 64 bit is just too big of a memory/cpu hit.
      • If 4k is your performance limit, and you didn't leave 2 bytes of unallocated space, you're a shit developer.

        • by jythie ( 914043 )
          Or a developer who understands the actual problem they are solving and the constraints they are working within. It is easy to say 'just leave a few bytes extra' when you don't know anything about the actual structure of the data or how it is processed. But that is also the hallmark of a shitty developer, talking solutions when they do not even know the problem.
    • and solve the problem for the rest of humanity's existence.

      A) Do you REALLY want to rewrite all that existing "legacy" code, databases, and the hardware that that still runs it? And so you're going to pay for the code changes AND economic AND hardware disruption, right?

      B) You mean 12 years, right? Link [insideclimatenews.org]. And there are a million others. Also this one [cnn.com], where they're taking DOWN the "in 12 years ..." signs.

      I'm (was) a Command Level CICS programmer back in 1986. Using 4K of RAM was an extravagance. And like the web, your program did its job, "saved a co

    • 64-bit minimum for everything. We aren't limited by RAM etc. nowadays, you can use the largest primitives available and that means 64-bit on anything in the last 10+ years, and "emulating" those on older platforms is incredibly trivial and we've been doing that for years too.

      You're still thinking too small. The smallest possible unit of time is the Planck unit. There are approximately 1.9 * 10**43 Planck time units in a second. Even so, it only takes about 202 bits to count every single Planck time unit since th

  • Arriva busses in the UK. Payment systems packed in on some of the buses in my area at the start of the year - drivers just waved everyone through after the QR-code readers were rejecting tickets. They fixed it last week though, so all good. I can't find any other reports of it happening though, so maybe it was just local. I spoke to one of the drivers and he just said all the buses he had driven for the first week of the year were affected.

  • by AndyKron ( 937105 ) on Sunday January 12, 2020 @08:45AM (#59611952)
    Who the hell buys a "professional wrestling video game"?
    • Who the hell buys a "professional wrestling video game"?

      Same people who buy a fifa game, or a nba game.

      Or the friends video game:
      https://www.amazon.com/Friends... [amazon.com]

      That is to say, people who like the WWE franchise for whatever reason. Be it for the "sporting" aspect, the "soap opera" aspect or the larger-than-life aspect.

  • I faced the Y2K as fresh faced CS graduate and even with my limited experience had the foresight to turn the two characters/bytes into an unsigned short integer of 16 bits.

    I guess I should not be staggered with hindsight given how much bodged trash code I've seen over the last 20 years, usually produced by those who also claim a Computing degree is a waste of time.

  • by skullandbones99 ( 3478115 ) on Sunday January 12, 2020 @10:08AM (#59612064)

    Unix / POSIX time is stored in a 32-bit signed integer (the summary incorrectly said just "integer") on a 32-bit operating system. The issue is that the 31-bit integer part overflows, causing the sign bit to be set, resulting in time jumping backwards to 1902, because 31 bits can represent about 68 years. 1902 to 1970 = 68 years, 1970 to 2038 = 68 years.

    Y2038 will cause systems to use the wrong Unix epoch era, and misbehaviour is likely. Typical problems will be timer events that span the Y2038 discontinuity, because the timer may never expire, or may expire far in the future. If the time period is relative then a system reboot would "fix" it, but any absolute time calculations are likely to fail.
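The backwards jump can be simulated by reinterpreting the overflowed counter as two's complement. (Strictly the wrap lands in December 1901; the "1902" above is the usual round figure.) A sketch:

```python
from datetime import datetime, timedelta, timezone

def to_int32(n):
    """Reinterpret an integer as a 32-bit two's-complement value,
    the way a signed 32-bit time_t stores it."""
    n &= 0xFFFFFFFF
    return n - 2**32 if n >= 2**31 else n

wrapped = to_int32(2**31 - 1 + 1)     # one second past the 2038 limit
assert wrapped == -2**31              # the counter goes hugely negative
wrapped_dt = datetime(1970, 1, 1, tzinfo=timezone.utc) + timedelta(seconds=wrapped)
print(wrapped_dt)  # 1901-12-13 20:45:52+00:00
```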

    Y2038 is not limited to just the Unix time clock because various sub-systems and protocols use timestamps. This means that using a 64-bit operating system is not a full solution as external devices and communication links can still have Y2038 issues. In other words, Y2038 is also a communication failure risk between systems.

    People should also be aware of the Y2036 rollover of NTP (Network Time Protocol), which many devices rely on to get their system time and date. This is likely to impact IoT devices. The Y2036 issue occurs because NTP moves to its next era. Mitigation is possible if the device keeps track of its shutdown time and date to prevent the NTP calculations overflowing. However, Y2036 in combination with Y2038 is likely to cause systems with no RTC (Real Time Clock) to boot up with incorrect times and dates.
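The NTP side of this is a separate counter: NTP timestamps hold seconds since 1 January 1900 in a 32-bit field, so era 0 runs out 2**32 seconds later. A quick check of when that lands:

```python
from datetime import datetime, timedelta, timezone

NTP_EPOCH = datetime(1900, 1, 1, tzinfo=timezone.utc)

# A 32-bit unsigned seconds field wraps after 2**32 seconds.
era1_start = NTP_EPOCH + timedelta(seconds=2**32)
print(era1_start)  # 2036-02-07 06:28:16+00:00
```

which is why the NTP rollover arrives in February 2036, roughly two years ahead of the Unix Y2038 limit.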

    Y2038 is likely to hit the automotive industry because vehicles last 20 years and Y2038 is just over 18 years from now.

    Ask yourself why you cannot set your Apple or Android smartphone past 2038, these devices are restricted to a max date of 2036 to avoid Y2038 failures.

    I suggest people look out for Y2038 Ready stickers, like those that appeared during the Y2K fixing campaign.

  • 2038 bug only really applies to 32 bit machines. If you're still running 32 bit machines, you are completely at fault for letting this bug present itself.

    Just like those who chose a windowed implementation to "fix" the issue.

    • It applies to filesystems too: ext2 and ext3 filesystems use 32-bit dates and can't handle dates after 2038.

      Funnily enough, the ext4 filesystem fixes this with another 'Lazy fix' - adding 2 extra bits for date, which just pushes the problem out by about another 400 years.
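The arithmetic behind "about another 400 years" is roughly this (the real ext4 on-disk encoding is more involved -- the extra bits act as an epoch extension on the old signed 32-bit field -- but the representable range works out the same):

```python
from datetime import datetime, timedelta, timezone

UNIX_EPOCH = datetime(1970, 1, 1, tzinfo=timezone.utc)

old_min = UNIX_EPOCH + timedelta(seconds=-2**31)   # late 1901, the old floor
# Two extra bits quadruple the representable span to 2**34 seconds.
new_max = old_min + timedelta(seconds=2**34 - 1)
print(old_min.year, new_max.year)  # 1901 2446
```

2446 minus 2038 is the roughly 400 extra years mentioned above.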

    • Please stop propagating the myth that 64-bit machines are immune from Y2038. This is untrue. Use of 64-bit machines probably gets you 95% to resolving the issue. But you can still suffer some Y2038 issues on 64-bit machines. It is true to say that unfixed 32-bit machines are 100% susceptible to Y2038.

      64-bit operating systems use a signed 64-bit integer for system time which resolves most but not all of the Y2038 issues.

      The Linux kernel (since about v5.1) has added a parallel set of system calls to allow 32-

  • by emacs_abuser ( 140283 ) on Sunday January 12, 2020 @11:40AM (#59612276)

    All this bull about 2 and 4 digit years is just that, bull.

    How hard is it to look at the current year and from that decide where the window is? That's the way I did my Y2K fixes; the user gets to continue entering 2-digit years, and the fixed code is immune even to the Y10K problem. If your dates can be safely treated as being within a 99-year window, there is no need for 4-digit years.
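A sliding window like the one described can be sketched as follows (the function name and the 80-year look-back are illustrative choices for this example, not anyone's actual implementation):

```python
from datetime import date

def expand_two_digit_year_sliding(yy, today=None, back=80):
    """Interpret a two-digit year as the candidate lying within `back`
    years before, or (99 - back) years after, the current year. The
    window re-centres itself every year, so it never expires."""
    current = (today or date.today()).year
    candidate = (current // 100) * 100 + yy
    if candidate < current - back:           # too far back: next century
        candidate += 100
    elif candidate > current + (99 - back):  # too far ahead: last century
        candidate -= 100
    return candidate

# Seen from 2020, the window covers 1940-2039:
assert expand_two_digit_year_sliding(45, date(2020, 1, 1)) == 1945
assert expand_two_digit_year_sliding(21, date(2020, 1, 1)) == 2021
```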

  • Did they really say the Y2038 problem will occur at Pi AM, 3:14 in the morning?

    I assume they mean in UTC so it won't be in my timezone but still.

    Actually, I'm surprised we didn't hit this in 2008 when financial systems needed to start computing the end date of 30 year loans. Maybe COBOL is saving our asses because they don't do the computation using time_t.

  • The smart young programmers actually implemented the window solution with a 21 year window.

    They're still working at the same companies, where their managers are reading about the Y2K20 bug and giving them big bonuses for having the foresight to fix the code "properly" 20 years ago. In the fall they'll be putting in their retirement papers thanks to the bonuses.
