Becoming an early adopter of new operating system versions, new IDEs, new software applications, even new games... some people still think it's cool, but it is really becoming a disease.
Microsoft brought it to us: each new version of Windows was theoretically backwards compatible, but how many existing applications and games worked from day one? Windows 95, 98, 2000, XP, Vista, 7... all of them were either compatible or had "compatibility settings", but the truth is that few applications (and fewer games) ran without needing patches or service packs.
Xbox? The first console to have "game updates" (patches, as they were called on PC), which can also be read as "the first console to have unfinished and buggy games". I don't remember a single PS2 title freezing, but I've already "killed" one Xbox 360 because of the RRoD, and found many games that needed "title updates" to stop hanging.
But Apple is also confirming this theory: it's not enough to have the latest iPhone to be cool; you also have to install the latest iOS, update your applications (sometimes weekly), and live with the newly introduced bugs (or even changes in how buttons behave, as with the iPad and the 4.2 update's lock-switch-to-mute change).
There was a time when things used to work; a time when installing the 1.1 version of something meant a huge speed increase in that awesome game, or new features in that application. A time when even "beta testing" was a paid job, not something almost mandatory for everyone as it is becoming now, whether it is sold as "be cool and test new features" or just "want to do X? wait a few months, or try this unstable version that lets you do it".
Now it works and spreads like a disease: some catch it by accident, others by "contact" (with cool people who push the idea that if you don't install the latest beta versions you're not geeky enough). At first it looks like nothing, but it can be fatal: lost iPhone data, lost videogame saves, corrupted files and newly incompatible formats.
Consoles constantly updating, games and applications constantly prompting or forcing you to update, operating systems with Service Packs you can beta-test months in advance (and that usually leave behind gigabytes of beta trash you cannot get rid of)... but you know what? You don't need the latest version of anything.
Never install something "0-day" if it replaces an existing version. Replicate the system, or do a fresh install, but never overwrite the existing software, at least until it matures and the bugs (which, believe me, always appear) get properly fixed.
Waiting a few weeks until all the major companies had non-beta Windows 7 drivers allowed my gaming PC to work almost flawlessly. Apple released a service pack for the "service pack" that was Snow Leopard just days after launch, to fix newly introduced bugs. Vista's first steps were terrible because of its lack of backwards compatibility... I could give a hundred examples without any effort.
iOS 4.2 changed a button on the iPad? Stay with 3.2; for games you'll only miss Game Center, and most titles will still work. Windows XP covers your gaming needs and you don't really need anything else? Don't switch to Windows 7: it will run slower and eat more RAM, and DirectX 10 performs worse than DirectX 9, so you have more to lose than to gain. Wait for your next computer and then switch to Windows 7 with better hardware.
Learning something new is great, but blindly using new things "because they're newer", even when they fail, is as bad as getting stuck on an obsolete technology.
Immature technologies can be cool, but they usually come with tradeoffs, whether it's worse performance, instability, or just a bunch of new bugs and problems in software that was working correctly before.
One final note: this doesn't mean you shouldn't try new things. Of course you should, just carefully and without risking your data. Sadly, software gets more complex every day, and you'll be "playing" with your personal data without warranties of any kind (check the terms of service, license agreement or whatever comes with it).