Intel has been on the i3, i5, i7 naming scheme for a while though. I think the oldest ones are probably ~15 years old at this point.
The i7 just marked Intel's top-of-the-line consumer parts until they introduced the i9 in 2017. The first models were introduced in 2008, but I think the mobile versions came in 2010.
So yeah 15 years is pretty close.
Nah, there were mobile i7s released in September 2009 (though how long they took to ship in actual hardware, I dunno).
https://www.cpu-world.com/CPUs/Core_i7/Intel-Core%20i7%20Mobile%20Extreme%20Edition%20I7-920XM%20BY80607002529AF.html
Ah, good find, I just skimmed Ark and didn’t see anything before Q1 '10.
The i7-2700K came out in 2011; were there any i7s before that?
Edit: yes there were. Like the 800 series.
13-year-old i7-2600 still going strong here.
I wonder how long it would take for the electricity cost difference to pay for, say, a ThinkCentre M710 (<100€). IIRC the 2500/2600 were quite power hungry.
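For a rough sense of how fast that pays off, here's a back-of-envelope sketch. The wattages, daily hours, and electricity price below are all assumptions, not measurements, so plug in your own numbers:

```python
# Back-of-envelope payback estimate for replacing an old desktop with a
# cheaper-to-run mini PC. All inputs below are assumptions, not measurements.

OLD_WATTS = 90        # assumed average draw of an i7-2600 box under light load
NEW_WATTS = 25        # assumed average draw of a ThinkCentre M710-class machine
HOURS_PER_DAY = 8     # assumed daily usage
PRICE_PER_KWH = 0.35  # assumed electricity price in EUR/kWh
NEW_PC_COST = 100.0   # rough purchase price in EUR

# Energy saved per year in kWh, then converted to money saved per year.
kwh_saved_per_year = (OLD_WATTS - NEW_WATTS) / 1000 * HOURS_PER_DAY * 365
eur_saved_per_year = kwh_saved_per_year * PRICE_PER_KWH

print(f"~{kwh_saved_per_year:.0f} kWh/year saved, ~{eur_saved_per_year:.0f} EUR/year")
print(f"Payback after roughly {NEW_PC_COST / eur_saved_per_year:.1f} years")
```

With those made-up numbers it's paid off in well under two years, and a machine running 24/7 as a server would close the gap even faster.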
Yeah, I had the i7-7700K, which was like 7 years ago, with like 64GB of RAM because I wanted to play with large ramdisks.
Yeah, my 2011 MacBook Pro has an i7. In computing terms, 13 years is an eternity.
But yeah, it's also got 16GB of RAM and a 500GB SSD and runs Mint like a dream.
I figure his username is his birth year
It could be as old as 15 years… If someone bought a specced-out i7 laptop in 2009, they may have upgraded it to 16GB at some point. Seems realistic enough.
Please stop. I’m only in my 30s, but you’re making me feel like I’m 80. To me, old is a 386 with 4MB of RAM, a 40MB hard drive, Windows 3.1, and a turbo button. Audio was limited to a single-channel square wave courtesy of the PC speaker, ’cause sound cards were expensive.
Or if you want to really talk old in the personal computing realm, then we’ll have to start bringing up companies like Commodore, Atari, and Radio Shack. But their computers were before my time.
Well personal computing just moved faster back then. Today, a decent computer from 10 years ago (2014) is perfectly usable for most people (with an SSD especially). But in 2010 if you had a top of the line computer that was from 2000 it was basically garbage. If you had a computer from 1990 in the year 2000 it was practically ancient history.
The PC market has just plateaued for everyday use. We just see incremental performance improvements for enthusiasts/professionals and little more than power draw improvements for everyone else.
For several years we didn’t even see those. When AMD wasn’t competitive, Intel didn’t do shit to improve their performance. Between like Sandy Bridge (2011) and Kaby Lake (2016) you’d get so little performance uplift, there wasn’t any point in upgrading, really. Coffee Lake for desktop (2017) and Whiskey Lake for laptops (2018) is when they actually started doing… anything, really.
Now we at least get said incremental performance improvements again, but IMO they’re not worth upgrading a desktop CPU for more often than every 5 or so years. You get way more from a graphics card upgrade, and if you’re not pushing 1080p at max fps, the improvements from a new CPU will be pretty hard to feel.
8GB DDR3 DIMMs do exist. It could be a decade-old laptop that can do that.
The first i7 came out like 15 years ago now. i7 came out before i5 or i3 as well.
My i7 ThinkPad is a dual-core and pretty trash. It can’t even play YouTube videos without forcing H.264, and even then it’s better to use FreeTube. Sounds about on par with a Raspberry Pi.
I have an Asus ROG laptop I bought in 2013 with a 3rd gen i7, whatever the GTX 660 mobile chip was, and 16GB of RAM. It’s definitely old by any definition, but swapping in an SSD makes it super usable; it’s the machine that lives in my garage as a shop/lab computer. To be fair, its job is web browsing, CAD touch-ups, slicing, and PDF viewing most of the time, but I bet I could be more demanding on it.
I had been running Mint with Cinnamon on it before, as I was concerned about resource usage, and it was a Klipper and OctoPrint host for my printer for a year and a bit. I wiped it and went for Debian with Xfce because, again, I was originally concerned about resource usage, but I ended up swapping to KDE and don’t notice any difference, so it’s staying that way.
I really hate waste, so I appreciate just how usable older hardware can be. Yeah, there’s probably an era where that’s less true, but I’ll go out on a limb (based on feeling only) and suggest it’ll hold for anything from the last 15 years. That’s going to depend on what you’re trying to do with it; you won’t have all the capability of more modern hardware, but frankly a lot of use cases probably don’t need that anyhow (web browsing, word processing, programming, music playback for sure, probably some video playback; I pretty much haven’t hit a wall yet with my laptop).
I have a ten-year-old MacBook Pro with an i7 and 16GB of RAM. Just because this thing was a total beast when it was new doesn’t mean it isn’t old now. It works great with Ubuntu, though. It’s still not a good idea to run it as a server; my Raspberry Pi consumes a lot less energy for some basic web hosting tasks. I only use the old MBP to run memory-intense Docker containers like openrouteservice, and I guess just using some hosting service for that would not be much more expensive.
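For anyone curious what that workload looks like: once the container is up, openrouteservice exposes an HTTP routing API. Here's a minimal sketch of querying a self-hosted instance, assuming the default port 8080, the stock /ors/v2 path, and the driving-car profile; the coordinates are just example values:

```python
# Minimal sketch of querying a self-hosted openrouteservice instance.
# Assumes the container listens on localhost:8080 with the default /ors/v2
# path and the driving-car profile; adjust to your own setup.
import requests

body = {
    # [longitude, latitude] pairs: start and destination (example values)
    "coordinates": [[8.681495, 49.41461], [8.687872, 49.420318]]
}

resp = requests.post(
    "http://localhost:8080/ors/v2/directions/driving-car",
    json=body,
    timeout=30,
)
resp.raise_for_status()

summary = resp.json()["routes"][0]["summary"]
print(f"Distance: {summary['distance']:.0f} m, duration: {summary['duration']:.0f} s")
```

The routing itself is cheap; it's the graph that the container keeps in RAM that eats memory, which is why old 16GB machines are handy for it.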
I manually upgraded a 3rd gen i7 (2012) machine to 32GB in 2016. Doesn’t make that laptop any less old, tho.
When I got a deal on my i7-3770K, I actually had enough to get more RAM. So that desktop has 16 gigs.
Still going strong since 2013. It’s an emulation rig now.