Well, this is one way to help AMD and other competitors go after the market of those who like to build their own systems.

Optimistically, this is a first step towards a system-in-package with an optical connection to the motherboard. That was supposed to start appearing in 2015; maybe Intel is going to push some boundaries.

You can still swap out BGA chips, it just requires a bit of skill with a heatgun. People do this all the time to fix their Xboxes.

Intel already had a habit of changing socket types much more often than AMD did, meaning that you had to upgrade your motherboard about as often as you had to upgrade your CPU anyway.

Still, motherboards fail more often than CPUs do, so for those times when your motherboard fails and you have to buy a whole new CPU/mobo combo, this is a bad thing.

This had to happen anyway.

reasoning: optical interconnects will be mainstream by 2017, so this will get people used to the idea that CPUs and mobos are one and the same

biggest impact:
mobo manufacturers -- this cuts out Asus and Gigabyte and all the others who make top-quality mobos. In fact, Intel wrote contracts saying that information on mobo design could be taken from the companies they partnered with, with no reciprocation if Intel made mobos themselves. To mobo makers this was a slap in the face.

Hardcore system builders can do what FrankHerbert is suggesting, so the only people this is really going to anger are the wannabe hardcore system builders. And since there are a buttload of those people, I predict a lot of angry users who will claim they'll switch over to AMD in protest, but then never actually follow through.

If AMD were better at winning converts from the pool of people pissed off with Intel, they wouldn't be having so many financial problems.

As an early adopter, I've only done a CPU swapout once in 43 years and that was 37 years ago. What you can gain from a CPU swap is minor compared to a motherboard upgrade, which isn't much more expensive.

Anyway, when you upgrade the CPU you have to upgrade the motherboard if you want a 2x-level improvement, and if you don't, the upgrade is a waste of time and money. So basically it's a don't-care for DIY system assemblers. They will just buy the CPU/mobo combo, which in many cases they do already (since there are cheap deals for them), except that now the CPU comes pre-soldered (less work).
As for mobo makers, it does indeed change their business model somewhat, since they will have to become distributors for CPUs (which cost twice as much as the mobo) and keep inventory of them. That is somewhat of a bummer, and I suspect it might kill this concept once it is attempted in the actual marketplace.

I disagree, as I got a proportionate speed improvement upgrading 1.8GHz Celerons to 2.4GHz Prescott CPUs in the same motherboards. Multiply that by a hundred motherboards and it's big money. Also, not all motherboards are equal. I used a pizza-box case which dictated a (then) small 9x9 motherboard. Bundling the CPU would certainly have constrained my performance options far more than something I arranged myself. Moreover, many Chinese boards have cheap fuses which blow at low temperatures. I had a Dell desktop fail last summer simply because the air conditioner was off that day. With Intel's meddling I would be out a CPU as well.

As someone who repairs and upgrades computers as part of the job I can say that I don't see this causing much change. If it appears to be a problem with either the motherboard or processor we usually replace both (unless it's an obviously defective capacitor, which is quick and easy to replace), because while keeping spare RAM, HDDs, power supplies, and other components that are verified good on hand is viable, having spare motherboards and processors that we can swap out to determine the source with any given system build is just not time, space and cost effective.

...having spare motherboards and processors that we can swap out to determine the source with any given system build is just not time, space and cost effective.
It's VERY time- and cost-effective if you have a server farm. Intel's approach is simply Apple's: limit consumer choice by bundling to raise prices and force consumers into substandard compromises.

Since Intel doesn't see huge performance gains ahead, they're saying "fuck it" and stamping out candy-ass prefab units for the unwashed masses. At that rate, soon they will be gluing the RAM onto the board like Apple. Why not the drive as well? The gaming and overclocking community can go with AMD.

For a server farm or for large companies that invest in large numbers of computers all at the same time, yes, I agree somewhat. I suppose I should qualify that my position is repairing computers for various small businesses that buy new machines as they get new employees, because bulk purchasing for all employees at once is not in their budget. Saying that Intel is going to start doing it with other components may be stretching it, though.
The cost of upgrading RAM relative to the performance gain is too high, and Intel knows that. I'd say they've at least put some effort into determining that this is a viable step. Someone running a large server farm will have the money to keep a spare board with processor on hand to swap a bad one out quickly (we did that anyway at a server farm I worked at previously), and I can tell you from experience that it's cheaper for a small business to just replace both instead of paying for additional diagnostic time, as much as I hate throwing away possibly good parts.

Joe Schmoe won't care. He purchases his PC from Walmart.

For him, this will lower his purchase price because none of them fancy, schmancy, clampy, wampy CPU sockets will be needed.

Back to the days of slotted processors?

"Back to the days of slotted processors?" - Rwinners

Why not? Compared to the CPU all of the peripheral components are as slow as a flowing glacier.

Actually, MBs are becoming so small that this is not likely to be an issue... as long as Intel provides adequate I/O for all the accessories... and the ability to accept third-party video processors.

So couple this with on-chip graphics and you have one-size-fits-all computers (read as: tablets that sit in tower cases). I'm guessing the next step is, just as another user said, soldering the RAM chips onto the board as well. Pretty soon after that they will determine how much storage space the "average person" needs and solder that on as well.

Isn't it awesome to have a company make all your decisions for you? Oh, and by the way, if you do any aftermarket modifications you will get sued for hardware tampering.

REMEMBER: Reverse engineering is a CRIME!

I don't see this as a problem. The important part is that we will still have access to a range of differently performing computers.

You don't see a lot of people complaining about their graphics card having every component soldered onto the board, so why should the rest of the computer be any different? Actually, this means cheaper and more reliable PCs.

These days the improvements you can make to a good "off the shelf gaming computer" are quite limited, and they'll be even more limited in the future.

As someone who has always built their PCs from scratch, this is a bad thing. I usually pick the parts I want and am not limited to cheap components which degrade the whole system. Once I made the mistake of buying a pre-assembled PC, and it was nothing but headaches (for me). And it cost me much more than if I had built it myself.

For those who want a pre-assembled PC, this is a good thing; it "should" mean cheaper prices.

The best solution, IMO, is for Intel to offer a choice of buying either the CPU/motherboard combo or the CPU alone.

Motherboards fail at a much higher rate than CPUs. I've had a CPU go through three motherboards on account of space, video card slot, and component failure. PCI/AGP has gone through at least five evolutions now. Among other failures are cheapo capacitors with water in them and fuses blown on a hot summer day. I've had motherboards catch fire. Some motherboards are dual-CPU, and I want to start a customer out with one and allow them an easy upgrade later. Intel is gluing their CPU to a known point of failure because they expect the entire unit to be trashed 90% of the time. Right now there's a lively CPU salvage market on eBay and molotok.ru that Intel wants eliminated.

1. If your motherboard or CPU dies, you are forced to buy both, since removing a BGA would cost more than the BGA itself, or vice versa.
2. Since this forces the mobo assembly to include the CPU, there will be fewer combinations available and fewer options for people who build systems.
3. This forces mobo companies to purchase the processor prior to selling the mobo. This introduces a huge liability and cost to the mobo industry which will most likely kill variety and small companies.
4. For people who start out with a cheap CPU or mobo and then upgrade later, this is no longer an option, which would push those purchases further out in time or reduce the amount invested.
5. Offering both AMD and Intel combinations on mobos will force mobo companies to make some hard decisions about what to invest in and how much to build. Again, this has a huge economic effect.

Bottom line, the market will speak and show that this is not a good business decision. Intel is trying to monopolize the market.

Maybe it's time for ATI/AMD to fork from the Intel instruction set?

At first I thought this was ridiculous, but after thinking on it some, the biggest complaint I can see really is expandability, since many features now reside, or are in the process of moving, onto the CPU itself anyway. If this offers better bandwidth and perhaps better latency as well, then I'd say go for it.

Everyone who is freaking out, consider the following. Intel graphics engineers have, like everyone else, full access to the transistor counts, clocks, voltages, and benchmarks of their competitors' products... and they're Intel engineers. DDR3 in its current form has a tenth the bandwidth of GDDR5 as commonly implemented. Intel has to do something drastic to get to discrete-graphics-level bandwidth, like reducing line distance to RAM by, say, half. I expect this to be part of that.
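
To put rough numbers on that bandwidth gap, here's a quick back-of-the-envelope calculation; the specific parts and transfer rates (dual-channel DDR3-1600, a 256-bit GDDR5 bus at 6 GT/s) are my own illustrative assumptions, not figures from the article:

# Rough peak-bandwidth comparison: dual-channel DDR3 vs. a typical GDDR5 card.
# The part choices below (DDR3-1600, 256-bit GDDR5 at 6 GT/s) are illustrative
# assumptions, not figures from the article.

def peak_bandwidth_gb_s(bus_width_bits, transfer_rate_mt_s):
    # peak GB/s = (bus width in bytes) * (transfers per second)
    return (bus_width_bits / 8) * transfer_rate_mt_s / 1000

ddr3 = peak_bandwidth_gb_s(128, 1600)   # dual-channel DDR3-1600 -> ~25.6 GB/s
gddr5 = peak_bandwidth_gb_s(256, 6000)  # 256-bit GDDR5 at 6 GT/s -> ~192 GB/s
print(f"DDR3  : {ddr3:.1f} GB/s")
print(f"GDDR5 : {gddr5:.1f} GB/s")
print(f"ratio : {gddr5 / ddr3:.1f}x")   # roughly 7-8x, same ballpark as "a tenth"

Even with generous assumptions on the DDR3 side, the gap is big enough that no amount of trace-length trimming alone closes it, which is exactly the "do something drastic" point.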

In electronic systems, more than half of all failures are solder joint failures. Ball grid arrays are fragile. The solder joints fail due to repeated thermal expansion and contraction.

Failure of solder joints is an extremely common problem for graphics cards, XBOX 360 and PS3. Now you're going to have to deal with them on CPUs as well.
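
For anyone curious why thermal cycling kills these joints, here's a first-order sketch of the shear strain a corner ball sees each cycle; the CTE values, distances, and temperature swing are assumed illustrative numbers for a generic BGA, not measurements of any actual Intel package:

# First-order shear strain on a BGA corner joint from the CTE mismatch between
# package and board. All numbers are illustrative assumptions for a generic BGA.

def joint_shear_strain(dnp_mm, cte_board_ppm, cte_pkg_ppm, delta_t_c, joint_height_mm):
    # strain ~= distance-from-neutral-point * (CTE mismatch) * (temp swing) / joint height
    cte_mismatch_per_c = (cte_board_ppm - cte_pkg_ppm) * 1e-6
    return dnp_mm * cte_mismatch_per_c * delta_t_c / joint_height_mm

# Corner ball 15 mm from the package centre, FR-4 board (~17 ppm/C) vs. a
# package at ~7 ppm/C, a 60 C idle-to-load swing, 0.3 mm joint height.
strain = joint_shear_strain(15, 17, 7, 60, 0.3)
print(f"shear strain per cycle: {strain:.1%}")  # ~3% of shear strain per cycle

A few percent of strain doesn't crack anything on day one; it's the thousands of heat-up/cool-down cycles that fatigue the joint, which is exactly the Xbox 360/PS3 failure mode mentioned above.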

Everyone who is freaking out, consider the following[...]


I'm already pissed at Intel for not selling CPUs without bundled integrated shit-graphics. I'm not going to be any less irritated making further sacrifices for the sake of marginal improvements in integrated graphics performance.

Welcome to the Free Market, where all of your wishes instantly materialize and you are never constrained by the marketing whims of Corporations.

"I'm not going to be any less irritated making further sacrifices for the sake of marginal improvements in integrated graphics performance."

"Well, this is one way to help AMD and other competitors go after the market of those who like to build their own systems."

Which is a (very small) niche market.

The CPU already takes a back seat to the GPU when it comes to what you should upgrade if you want to have a faster (gaming) system.

The integrated GPU is crap. Maybe the GPU (Nvidia) will finally put the processor to rest, and this is the spark that was needed. Greed will lead every company to greatness.