Discussion: A possible point in the future where Intel will no longer make the Core i3.

Amol S.

Platinum Member
Mar 14, 2015
2,390
709
136
Hardware demand and acceptability are based on two factors: space and time. Today space is no longer an issue; there are already single hard drives with 6TB or even 10TB of capacity. The issue in the modern world has turned more towards time and computing power. Most modern-day CPUs can handle the future, but each CPU has its own life cycle. The Core i3 is the lowest of the low in modern CPU lines. Before the introduction of the 7th- and 8th-gen Intel Core processors, the Intel Pentium and Atom were the lowest of the low. Now neither of those two processors is ever found in any new PC or laptop. The Core i3 itself is already known to have many issues with its implementation for the very demanding Windows 10 OS. Many devices with a Core i3 do not last long on average, usually only a year and a half at most before newer software takes a toll on the CPU's much lower processing power. This is not the case for the i5 and i7 with their superior processing power. In the modern world there is very little use for the Core i3; the majority of the public use a Core i5 or i7.

Laptop manufacturers rarely if ever, for non-workstation or non-professional laptops, mix an i3 and a discrete NVIDIA or AMD GPU on the same device, as it defeats the purpose of there being a Core i3 in the first place. The Core i3's purpose is mostly lower power consumption and non-demanding workloads, which is the opposite of what NVIDIA and AMD discrete graphics are meant for. The problem is that many everyday applications are now moving towards recommending that their software be used on discrete GPUs rather than integrated GPUs. One such example is Firefox, which intends to make NVIDIA the default renderer on devices that support it. <---Link is to the Firefox Beta release notes for version 67beta.
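
As a rough sketch of how that opt-in commonly works on Windows laptops with switchable graphics (not taken from the linked release notes): an application can export two well-known globals that the NVIDIA Optimus and AMD PowerXpress driver stacks read at process start to decide which GPU the process renders on. A minimal C++ sketch, assuming those driver stacks are present:

#include <windows.h>

// A nonzero value in these exported globals asks the driver to run this
// process on the discrete GPU instead of the integrated one.
extern "C" {
    __declspec(dllexport) DWORD NvOptimusEnablement = 0x00000001;         // NVIDIA
    __declspec(dllexport) int   AmdPowerXpressRequestHighPerformance = 1; // AMD
}

int main() {
    // Any rendering context created by this process now defaults to the
    // discrete adapter on supported driver stacks.
    return 0;
}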

Using only integrated graphics actually poses a greater security risk to the user compared to software that runs on a discrete GPU. The reason is that most, if not all, system UI or graphical instances run mostly on the integrated GPU alone. Given any holes in the security of the integrated GPU software, nefarious software or plugins running on the integrated GPU could wreak havoc on the system processes running there. Basically, a discrete GPU has recently been used by software developers as a security layer to prevent harm to the user's system from any software or plugins that try to leverage security vulnerabilities in an integrated GPU.
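
For context on how software picks one GPU over the other: a program first has to tell the adapters apart. Below is a minimal Windows/DXGI sketch; the integrated-vs-discrete test (Intel's 0x8086 PCI vendor ID, dedicated VRAM size) is a heuristic assumed for illustration, not a hard rule.

#include <dxgi.h>
#include <cstdio>
#pragma comment(lib, "dxgi.lib")

int main() {
    // Enumerate every GPU adapter the OS knows about.
    IDXGIFactory1* factory = nullptr;
    if (FAILED(CreateDXGIFactory1(__uuidof(IDXGIFactory1), (void**)&factory)))
        return 1;

    IDXGIAdapter1* adapter = nullptr;
    for (UINT i = 0; factory->EnumAdapters1(i, &adapter) != DXGI_ERROR_NOT_FOUND; ++i) {
        DXGI_ADAPTER_DESC1 desc;
        adapter->GetDesc1(&desc);
        // Heuristic: Intel's vendor ID usually means the iGPU on these laptops;
        // a large dedicated-VRAM pool suggests a discrete card.
        const bool likelyIntegrated = (desc.VendorId == 0x8086);
        wprintf(L"Adapter %u: %s (%s, %llu MB dedicated VRAM)\n",
                i, desc.Description,
                likelyIntegrated ? L"likely integrated" : L"likely discrete",
                (unsigned long long)(desc.DedicatedVideoMemory >> 20));
        adapter->Release();
    }
    factory->Release();
    return 0;
}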

Another issue for the Core i3 is the high variety of CPU branding, as well as the competition with AMD. With the release of the Intel Core i9, there are now four lines of modern Intel CPUs. It's a well-known fact that manufacturers do not use every CPU that Intel is currently manufacturing. Then there are the AMD CPUs crowding the space. Basically, with such a crowded space, there is little room for the Core i3 to thrive.

The Core i3 is currently an x86 or x64 architecture, and is thus unsuitable for use in low-end electronics like smartphones, which require ARM-based CPU architectures. It would require Intel to invest in making ARM-based Core i3 CPUs.
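
That ISA split is visible right at compile time. A tiny C++ sketch using only the standard compiler-defined architecture macros:

#include <cstdio>

int main() {
    // A binary is built for exactly one ISA: an x86/x64 build that runs on a
    // Core i3 cannot execute natively on an ARM smartphone SoC, and vice versa.
#if defined(__x86_64__) || defined(_M_X64) || defined(__i386__) || defined(_M_IX86)
    std::puts("x86/x64 build: Core i3-class hardware");
#elif defined(__aarch64__) || defined(_M_ARM64) || defined(__arm__)
    std::puts("ARM build: typical smartphone SoC");
#else
    std::puts("some other ISA");
#endif
    return 0;
}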
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
The current Coffee Lake Core i3 is more than capable of running modern software, and most people don't need more than four cores anyway, even on Windows 10. Many people are even using the older 2c/4t i3's just fine as well.

And Intel isn't likely to produce any ARM-based CPUs; why would they?
 

CHADBOGA

Platinum Member
Mar 31, 2009
2,135
832
136
The i3 can be any kind of CPU Intel wants it to be, whether that is 2 core, 4 core, 6 core, turbo or no turbo.

It is simply a marketing descriptor, so there is no reason for Intel to stop using it; rather, if it is lacking, they can adjust the CPUs that get the i3 branding.
 

VirtualLarry

No Lifer
Aug 25, 2001
56,327
10,034
126
Using only integrated graphics actually poses a greater security risk to the user compared to software that runs on a discrete GPU. The reason is that most, if not all, system UI or graphical instances run mostly on the integrated GPU alone. Given any holes in the security of the integrated GPU software, nefarious software or plugins running on the integrated GPU could wreak havoc on the system processes running there. Basically, a discrete GPU has recently been used by software developers as a security layer to prevent harm to the user's system from any software or plugins that try to leverage security vulnerabilities in an integrated GPU.
Cite please? I have never heard this.
 

Amol S.

Platinum Member
Mar 14, 2015
2,390
709
136
Neither have I, and I'm wondering if he is just making this up and/or only reading a tech site that is well known for spreading rumors and has mostly useless articles.
It's actually theorized, since NO ONE HAS LOOKED FOR VULNERABILITIES IN INTEGRATED GPUs. For example, the ACM paper states that "While integrated GPUs also use IOMMUs for safety, we leave for future work the dangers and mitigations for integrated GPU security." (Zhiting Zhu, Sangman Kim, Yuri Rozhanski, Yige Hu, Emmett Witchel, and Mark Silberstein. 2017. Understanding The Security of Discrete GPUs. In Proceedings of the General Purpose GPUs (GPGPU-10). ACM, New York, NY, USA, 1-11. DOI: https://doi.org/10.1145/3038228.3038233) Basically, this means integrated GPUs can have the same or similar vulnerabilities. According to another journal article, many tests have been run on discrete and integrated GPUs, and among those tests similar vulnerabilities have been found in both. However, a table in that article shows that not all aspects of integrated graphics have been tested for vulnerabilities. (Mittal, Sparsh, et al. “A Survey of Techniques for Improving Security of GPUs.” Journal of Hardware and Systems Security, vol. 2, no. 3, 2018, pp. 266–285, doi:10.1007/s41635-018-0039-0. https://arxiv.org/pdf/1804.00114v1.pdf) The table is in section 5.3.

Just from inference on the results in that table, the latter journal article shows that integrated GPUs are either AS secure as or LESS secure than discrete GPUs, but are certainly not more secure than discrete GPUs. The latter portion of the previous statement holds true regardless of the outcome of any future tests showing they do not have the same vulnerabilities. The reason is that one cannot just hope that future tests come out showing they are safe. What if they are not?

On top of which, Intel has already started to move its CPU security modules to its integrated GPUs. <---Link to article that describes this. This would make integrated GPUs an even more prime target for attacks, as leveraging that security would mean no CPU or RAM security scanning would be taking place.

On a side note, integrated GPUs in small-form-factor mobile SoCs have been shown to accelerate microarchitectural attacks on devices that run this architecture. (Frigo, Pietro, et al. “Grand Pwning Unit: Accelerating Microarchitectural Attacks with the GPU.” 2018 IEEE Symposium on Security and Privacy (SP), 2018, doi:10.1109/sp.2018.00022.)


But in the end..... whether I am right or wrong..... it's goodbye to integrated graphics. Intel is removing its integrated GPU in its future 9th-generation CPUs. Multiple article links for this:

https://www.digitaltrends.com/compu...rocessors-without-integrated-graphics-leaked/ <----Article has a link to and cites Tom's Hardware, a subgroup of AnandTech's parent company

https://www.tomshardware.com/news/i9-9900kf-i7-9700kf-i5-9600kf-i5-9400f-prices,38284.html

https://www.pcmag.com/roundup/366303/the-best-cpus <---One of the CPUs it describes comes without integrated graphics.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Removing the iGPU from the i9 CPUs actually makes sense since nearly everyone using that platform is also using a dGPU with it.

For i7 and lower? I seriously doubt this, as doing so would just piss off OEMs big time and only cause them to switch over to AMD to replace those CPUs instead.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Intel is removing its integrated GPU in its future 9th-generation CPUs.

Intel will continue producing numerous iGPU-equipped products. Their entire IceLake U/Y lineup coming later this year will feature iGPUs based on Gen11. This is well-known and documented.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Intel will continue producing numerous iGPU-equipped products. Their entire IceLake U/Y lineup coming later this year will feature iGPUs based on Gen11. This is well-known and documented.
Aside from making room for more cores and/or cache, I don't see Intel removing the iGPU from its consumer-level processors.
 

DrMrLordX

Lifer
Apr 27, 2000
21,620
10,829
136
Aside from making room for more cores and/or cache, I don't see Intel removing the iGPU from its consumer-level processors.

Indeed. The oddball 9th gen CPUs that have no iGPUs are just that . . . oddballs. If 10c Comet Lake ever shows up, I suspect it'll come without an iGPU as well.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
Indeed. The oddball 9th gen CPUs that have no iGPUs are just that . . . oddballs. If 10c Comet Lake ever shows up, I suspect it'll come without an iGPU as well.
I don't know where the OP gets the idea that Intel will drop the i3 or remove the iGPU from its consumer lineup, aside from the i9's, and even that will be limited to the Extreme Editions only.

As far as I can tell, 4c/4t will still be quite usable for a long time to come. Hell, I have no plans at all to replace my Haswell i5-4670 system anytime soon.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
It's 2019 and we still have Celerons...
Anybody having any trouble running Windows 10 on any CPU just has no idea how to operate a PC; Windows 10 is pretty light.
 

Genx87

Lifer
Apr 8, 2002
41,095
513
126
The i3 can be any kind of CPU Intel wants it to be, whether that is 2 core, 4 core, 6 core, turbo or no turbo.

It is simply a marketing descriptor, so there is no reason for Intel to stop using it; rather, if it is lacking, they can adjust the CPUs that get the i3 branding.

Exactly this. i3 is a brand that represents the lowest rung of their "core" product line. It's fluid. When i5s are 8-core, the i3 can be 4-6 core. When the i5 moves to 12 cores, the i3 can be 8-core, etc. And as already noted, the Celeron is still being sold. It has grown from being a single-core low-end device without L2 cache ("Deceleron") into what it is today.
 

mopardude87

Diamond Member
Oct 22, 2018
3,348
1,575
96
If it's 4-core based, whatever the name, Celeron or Core i3, it's gonna be good enough for something now, and even a few years from now I'm sure, especially for basic usage. Gamers are typically going to stick to i5/i7 chips, and currently, from my experience, the latest i7 8700 is all you will probably need.

If mobile devices become bigger or more practical, perhaps there is a future where the only people with desktops are gamers, servers, or hardcore users, and perhaps anything but a Core i5 will become a thing of the past? When you think about it, having a dedicated desk space with a 20''+ screen, a big tower, a keyboard, and a mouse is kind of silly just to watch YouTube/Netflix/Hulu, check email, or go on Facebook, when you could have a 65'' 4K with a remote, watching from the comfort of your couch. In a few years we will probably laugh at people devoting a 5x4 space to a computer for anything but gaming or serious work, given how far mobile technology has come.

Kind of surprised anyone buys a dedicated tower for anything but gaming; I would be amazed if a $200 tablet can't hook up to a TV these days and do anything not gaming- or rendering-related. Maybe next-generation consoles could make towers obsolete for the vast majority? I keep hearing about keyboard and mouse support, so why not?
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
I for one like having a decent-sized display. And you can pry my tower out of my cold, dead hands.
 

ehume

Golden Member
Nov 6, 2009
1,511
73
91
I tend to agree. Heck, I don't even have a dGPU on my i7 4770k. My daughter has one on her i7 8700k, but I don't need one.
 

scannall

Golden Member
Jan 1, 2012
1,946
1,638
136
i3, i7, Celeron or pink petunias. It's all marketing names. Intel will make products, and stick whatever label on them they want and feel will sell the most units.
 

whm1974

Diamond Member
Jul 24, 2016
9,460
1,570
96
i3, i7, Celeron or pink petunias. It's all marketing names. Intel will make products, and stick whatever label on them they want and feel will sell the most units.
I'm wondering if Intel will make the Pentiums quad-core sometime in the future. Consider that the AMD R3 2200G is available at the same price, and has four cores and a much better iGPU than what the Core line does.
 

Shmee

Memory & Storage, Graphics Cards Mod Elite Member
Super Moderator
Sep 13, 2008
7,400
2,437
146
I think the i3 title is fine, but at this point it would serve well to get rid of Pentiums and Celerons, as those are always bottom bin and usually have confusing numbers. At the very least, drop the Celerons. It would make sense to have a 4c/4t i3, a 6c/6t i5, and then the i7 at 8c/8t, with everything above that being i9. I suppose if they have a lot of failed quads to salvage, they could go with a 2c Pentium with HT.
 

TheELF

Diamond Member
Dec 22, 2012
3,973
730
126
What are you talking about? Pentiums and Celerons are the bread and butter of the CPU industry for Intel; I wouldn't be surprised if they make more money from them than from server sales.

Sure, at some point Celerons are gonna get HT and Pentiums are going to be 4c; just as the single-core CPU got phased out, so will the plain dual core. But Intel doesn't have to do that for quite a few years; they might, but they don't have to.