I certainly do NOT mean to be offensive, particularly to Fred, since I'm pointing this out on your post.
I worked in the IT industry for nearly 40 years. Y2K was a Non-Issue because so many IT professionals spent long months reviewing, modifying and testing computer program logic long before that night arrived. In our Hospital Environment we averted major problems by following this course of action, and I was aware of Many other industries all doing the same thing.
Hence the appearance of "a lot of nothing" (and making it look that way was exactly our goal!!!).
Many folks who make these comments have little or no understanding of the problem. Many of these systems were coded in the COBOL or PL/I programming languages, and most programmers, particularly in the 70's, 80's and yes, even the 90's, did NOT account for the century... it was always assumed to be "19". Dates were typically stored and compared as a six-digit number (YYMMDD: YY=Year, MM=Month, DD=Day). So when the machine looked at a date, all it would see is [19]991231 (the "19" was 'invisible'). On the night of Y2K, the machine then thinks the date is 000101.
The problem arises when the machine compares dates. The six-digit date 991231 is Greater than 000101; however, put in proper context with the century, 19991231 is Less than 20000101. As mentioned above, many applications (including the OS and the COBOL and PL/I compilers as commonly used) didn't accommodate eight-digit (CCYYMMDD) dates, so all these date compares and file structures had to be changed to handle eight-digit dates.
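To make the compare problem concrete, here is a minimal sketch in COBOL-85 style (the program name and field names are hypothetical, invented only for illustration): first the six-digit compare that goes wrong at the rollover, then the eight-digit compare that behaves correctly.

       IDENTIFICATION DIVISION.
       PROGRAM-ID. Y2K-DEMO.
       DATA DIVISION.
       WORKING-STORAGE SECTION.
      * Six-digit YYMMDD dates - the century "19" is only implied
       01  OLD-DATE-6      PIC 9(6)  VALUE 991231.
       01  NEW-DATE-6      PIC 9(6)  VALUE 000101.
      * Eight-digit CCYYMMDD dates - the century is stored explicitly
       01  OLD-DATE-8      PIC 9(8)  VALUE 19991231.
       01  NEW-DATE-8      PIC 9(8)  VALUE 20000101.
       PROCEDURE DIVISION.
      * Six-digit compare: 000101 is numerically less than 991231,
      * so Jan 1, 2000 looks like it falls almost a century earlier
           IF NEW-DATE-6 > OLD-DATE-6
               DISPLAY "6-DIGIT: JAN 1 SORTS AFTER DEC 31 (OK)"
           ELSE
               DISPLAY "6-DIGIT: JAN 1 SORTS BEFORE DEC 31 (Y2K BUG)"
           END-IF
      * Eight-digit compare: 20000101 > 19991231, as it should be
           IF NEW-DATE-8 > OLD-DATE-8
               DISPLAY "8-DIGIT: JAN 1 SORTS AFTER DEC 31 (OK)"
           ELSE
               DISPLAY "8-DIGIT: JAN 1 SORTS BEFORE DEC 31 (BUG)"
           END-IF
           STOP RUN.

That one-field change sounds trivial, but it had to be repeated across every date field, file layout and compare in every program, which is where the "long months" of work went.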
We had Billing systems, Registration and Scheduling systems, and Lab, Radiology and Pharmacy systems that were all loaded with six-digit date logic. Left unchanged, these would have posed very serious issues.
BTW - I'm new to the forum and I apologize if this seems confrontational or petty... but this issue is a pet peeve of mine and I mean for this to be educational.