Nintendo Wii: Sold like gangbusters.
64-bit Processors: The computing standard.
Battlestar Galactica: Considered one of the greatest sci-fi series of all time.
Facebook: Continues to be the world’s leading social media platform, with literally BILLIONS of users.
High Definition: HD only got even more HD.
iPhone: Set the standard for smartphone form factor and function to this day, 16 years later.
Well, to be fair, changes like switching to 64-bit are always very slow (especially when they’re not being forced by completely blocking 32-bit). But I don’t think it was overhyped; it just takes time, and more RAM was definitely needed to achieve the kinds of games/apps we have now.
Well by 2008 we’d had consumer-grade 64-bit CPUs for 5 years and technically had had 64-bit Windows for 3, but it was a huge mess. There was little upside to using 64-bit Windows in 2008 and 64-bit computing had been hyped up pretty hard for years. You can easily see how one might think that it’s not worth the effort in the personal computer space.
I feel like it finally reached a turning point in 2009 and became useful in the early to mid 2010s. 2009 gave us the first GOOD 64-bit Windows version with mass adoption (Windows 7), and in the 2010s we started getting 64-bit software (2010 for Photoshop, 2014 for Chrome, 2015 for Firefox).
It was different for Linux and servers in particular, of course, where a lot of open source stuff already had official 64-bit builds in the early 00s (Apache in 2003, for example).