IT

If Remote Work Lasts Two Years, Will Employees Ever Return to Offices? (livemint.com) 230

"With the latest wave of return-to-office delays from Covid-19, some companies are considering a new possibility: Offices may be closed for nearly two years," reports the Wall Street Journal.

"That is raising concerns among executives that the longer people stay at home, the harder or more disruptive it could be to eventually bring them back." Many employees developed new routines during the pandemic, swapping commuting for exercise or blocking hours for uninterrupted work. Even staffers who once bristled at doing their jobs outside of an office have come to embrace the flexibility and productivity of at-home life over the past 18 months, many say. Surveys have shown that enthusiasm for remote work has only increased as the pandemic has stretched on. "If you have a little blip, people go back to the old way. Well, this ain't a blip," said Pat Gelsinger, chief executive officer of Intel Corp., whose company has benefited from the work-from-home boom. He predicts hybrid and remote work will remain the norm for months and years to come. "There is no going back...."

[W]hat many have concluded over time is that their companies can operate largely effectively while remote, executives and workers say... As more time passes until offices reopen, it could become difficult to convince existing employees to willingly upend their new lives and return to pre-pandemic schedules in offices, executives say.

Apple, Amazon, Facebook, and Lyft have now all postponed the return to their U.S. workplaces until 2022.
Intel

Intel Previews Its Alder Lake Chip, Promises Hybrid CPUs for Desktops and Laptops (theverge.com) 36

Intel has spent much of 2021 announcing plans for its future: a new IDM 2.0 strategy, new naming schemes for its process nodes, and new desktop GPUs. At Intel's Architecture Day 2021, we finally got a preview of how some of those changes are coming together in new chips, starting with the upcoming Alder Lake lineup later this year. From a report: As the company has been teasing since last year's Architecture Day, Alder Lake will feature Intel's latest hybrid architecture: instead of simply offering the next generation of powerful Intel CPU cores, it'll offer a mix of both performance and efficiency x86 cores, both of which Intel previewed as part of its announcements. Additionally, Alder Lake will be the first chip released on Intel's newly renamed Intel 7 technology node (not to be confused with Intel 4, which was previously known as Intel's delayed 7nm node, and will be available to consumers sometime in 2023 under the codename "Meteor Lake"). Intel 7 still uses similar technology to the company's current 10nm tech, instead of the bigger leap in manufacturing processes planned for Intel 4.

The new x86 performance core -- codenamed "Golden Cove" -- is the successor to the Willow Cove cores that are currently found in the company's 11th Gen Tiger Lake processors. Intel claims that it's the most powerful CPU core it's ever built, but the company only offered a comparison to its Cypress Cove cores (the version of its 10nm architecture that Intel ported to its 14nm process), not the more advanced Willow Cove cores. Meanwhile, the company's new x86 Efficient core (codenamed "Gracemont") aims to be "the world's most efficient x86 CPU core" while still offering higher IPC than the company's Skylake chips. Intel claims that for single-thread cases, one of its new Efficient cores delivers 40 percent more performance than a Skylake core at the same power (or similar performance at 40 percent of the power), improvements that double when comparing four Efficient cores running four threads to two Skylake cores running four threads.
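Taken at face value, those two claims imply different performance-per-watt ratios depending on which end of the curve the core operates at. A minimal sketch of the arithmetic, using Intel's quoted marketing figures as givens (a Skylake core normalized to 1x performance at 1x power; these are not measurements):

```python
# Rough perf-per-watt arithmetic implied by Intel's Gracemont-vs-Skylake
# claims. The input numbers are Intel's quoted figures, not benchmarks.

def perf_per_watt(perf: float, power: float) -> float:
    """Performance per unit power, relative to a baseline of 1.0 / 1.0."""
    return perf / power

skylake = perf_per_watt(1.0, 1.0)

# Claim 1: 40 percent more performance at the same power.
gracemont_fast = perf_per_watt(1.4, 1.0)

# Claim 2: similar performance at 40 percent of the power.
gracemont_frugal = perf_per_watt(1.0, 0.4)

print(f"Skylake baseline:       {skylake:.2f}x perf/W")
print(f"Gracemont (same power): {gracemont_fast:.2f}x perf/W")
print(f"Gracemont (same perf):  {gracemont_frugal:.2f}x perf/W")
```

The spread between the two results (1.4x versus 2.5x) is why efficiency cores look far better when run wide and slow, which is exactly the four-Efficient-cores-versus-two-Skylake-cores comparison Intel highlights.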

Intel

Intel Is Giving Up On Its AI-Powered RealSense Cameras (engadget.com) 16

In a statement to CRN, Intel said it was "winding down" RealSense and transferring the talent and computer vision tech to efforts that "better support" its core chip businesses. Engadget reports: Questions surfaced about the fate of RealSense after the team's leader, Sagi Ben Moshe, said he was leaving Intel two weeks ago. RealSense aimed to make computer vision more flexible and accessible. A company or researcher could buy cameras to aid everything from robot navigation through to facial recognition, and there was even a developer-focused phone. It was never a truly mainstream product, though, and ASI VP Kent Tibbils told CRN that there were few customers buying RealSense cameras in any significant quantities. It wasn't really a money-making division, even if the work helped Intel's other teams.

For Intel, there's likely a simpler answer: it wants to cut ballast. CEO Pat Gelsinger wants Intel to reclaim the chipmaking crown, and that means concentrating its resources on design and manufacturing capabilities. No matter how successful RealSense is, it's a potential distraction from Intel's latest strategy.

Portables (Apple)

Apple Planning Multiple Events For the Fall, M1X MacBook Pros To Be Available By November (macrumors.com) 55

An anonymous reader quotes a report from MacRumors: Apple is planning to hold multiple events this fall, which will collectively include the launch of new iPhones, Apple Watches, updated AirPods, revamped iPad mini, and the redesigned MacBook Pros, according to respected Bloomberg journalist Mark Gurman. In his latest weekly Power On newsletter, Gurman says that much like last year, Apple will hold multiple events this coming fall, with the first likely being in September for the iPhone 13. Last year, due to the global health crisis and production constraints, the iPhone 12 lineup was not announced until October. The 2020 September event, rather than focusing on new iPhones, showcased new Apple Watches, iPads, and services.

This year, Apple is expected to return to its tradition of announcing its flagship yearly iPhone update in September, according to multiple reports. In today's newsletter, Gurman reiterated his reporting from earlier last week, setting expectations for the iPhone 13 to include camera updates aimed at professional users, more advanced displays, and a smaller notch. Alongside the new iPhones, Gurman says, as previously reported, that Apple can be expected to launch the third-generation AirPods featuring an updated design; an updated iPad mini with a larger display, thinner borders, and improved performance; and the Apple Watch Series 7 with flatter, improved displays and better performance.

As for the highly anticipated MacBook Pros featuring mini-LED displays, updated designs, and the M1X Apple silicon chip, Gurman says they will be available by the time the current Intel-powered 16-inch MacBook Pro celebrates its second anniversary. The 16-inch MacBook Pro was last updated in November 2019. The first event of the fall, in September, will likely include the new iPhones, Apple Watches, and AirPods, while the new iPads and possible updates to some of the company's services could be reserved for a second event, with the final event of the season focused on Apple silicon Macs.

Intel

Intel Enters the PC Gaming GPU Battle With Arc 92

Dave Knott writes: Intel is branding its upcoming consumer GPUs as Intel Arc. This new Arc brand will cover both the hardware and software powering Intel's high-end discrete GPUs, as well as multiple hardware generations. The first of those, known previously as DG2, is expected to arrive in the form of codename "Alchemist" in Q1 2022. Intel's Arc GPUs will be capable of mesh shading, variable rate shading, video upscaling, and real-time ray tracing. Most importantly, Intel is also promising AI-accelerated super sampling, which sounds like Intel has its own competitor to Nvidia's Deep Learning Super Sampling (DLSS) technology.
IBM

The IBM PC Turns 40 (theregister.com) 117

The Register's Richard Speed commemorates the 40th anniversary of the introduction of the IBM Model 5150: IBM was famously late to the game when the Model 5150 (or IBM PC) put in an appearance. The likes of Commodore and Apple pretty much dominated the microcomputer world as the 1970s came to a close and the 1980s began. Big Blue, on the other hand, was better known for its sober, business-oriented products and its eye-watering price tags. However, as its customers began eyeing Apple products, IBM lumbered toward the market, creating a working group that could dispense with the traditional epic lead times of Big Blue and take a more agile approach. One choice was to use off-the-shelf hardware and software and adopt an open architecture. A significant choice, as things turned out.

Intel's 8088 was selected over the competition (including IBM's own RISC processor), and famously, Microsoft was tapped to provide PC DOS as well as the BASIC included in the ROM. So this marks the 40th anniversary of PC DOS, aka MS-DOS, too. You can find Microsoft's old MS-DOS source code here. The base price for the 5150 was $1,565, with a fully loaded system rising to more than $3,000. Users could enjoy high-resolution monochrome text via the MDA card or some low-resolution graphics (and vaguely nauseating colors) through a CGA card (which could be installed simultaneously). RAM came in 16 or 64kB flavors and could be upgraded to 256kB, while the Intel 8088 CPU chugged along at 4.77 MHz.

Storage came courtesy of up to two 5.25" floppy disks, and the ability to attach a cassette recorder -- an option swiftly stripped from later models. There was no hard disk, and adding one presented a problem for users with deep enough pockets: the motherboard and software didn't support it and the power supply was a bit weedy. IBM would resolve this as the PC evolved. Importantly, the motherboard also included slots for expansion, which eventually became known as the Industry Standard Architecture (ISA) bus as the IBM PC clone sector exploded. IBM's approach resulted in an immense market for expansion cards and third party software.
Though the Model 5150 "sold like hotcakes," Speed notes, it was eventually discontinued in 1987.
Crime

Samsung Leader Jay Y. Lee Granted Parole, To Leave Prison On Friday (reuters.com) 26

Samsung vice chairman Jay Y. Lee, in jail after convictions for bribery, embezzlement and other charges, has qualified for parole and is expected to leave prison this Friday, South Korea's justice ministry said. Reuters reports: "The decision to grant Samsung Electronics vice chairman Jay Y. Lee parole was the result of a comprehensive review of various factors such as public sentiment and good behavior during detention," the ministry said in a statement on Monday.
Convicted of bribing a friend of former President Park Geun-hye, Lee, 53, has served 18 months of a revised 30-month sentence. He initially served one year of a five-year sentence handed down in August 2017 before it was suspended. That court decision was then overturned, and while the sentence was shortened, he was sent back to jail in January of this year. Lee still needs the Justice Minister to approve his return to work, as the law bars persons with certain convictions from working for companies related to those convictions for five years. He is likely to get that approval, legal experts say, due to circumstances such as the amount deemed embezzled having been repaid.
The Federation of Korean Industries, a big business lobby, welcomed the decision, adding: "If the investment clock, currently at standstill, is not wound up quickly, we could lag behind global companies such as Intel and TSMC and lose the Korean economy's bread and butter at a moment's notice."
AI

Self-Driving Car Startup Wants to Spare AI From Making Life-or-Death Decisions (washingtonpost.com) 134

Instead of having AI in a self-driving car decide whether to kill its driver or pedestrians, the Washington Post reports there's a new philosophy gaining traction: Why not stop cars from getting in life-or-death situations in the first place? (Alternate URL): After all, the whole point of automated cars is to create road conditions where vehicles are more aware than humans are, and thus better at predicting and preventing accidents. That might avoid some of the rare occurrences where human life hangs in the balance of a split-second decision... Who gets killed or injured probably isn't a decision you'd like to leave up to your car, or the company manufacturing it, anytime soon. That's the thinking now about advanced AI: It's supposed to prevent the scenarios that lead to crashes, making the choice of who dies one that the AI should never have to face.

Humans get distracted by texting, while cars don't care what your friends have to say. Humans might miss objects obscured by their vehicle's blind spot. Lidar can pick those things up, and 360 cameras should work even if your eyes get tired. Radar can bounce around from one vehicle to the next, and might spot a car decelerating up ahead faster than a human can... [Serial entrepreneur Barry] Lunn is the founder and CEO of Provizio, an accident-prevention technology company. Provizio's secret sauce is a "five-dimensional" vision system made up of high-end radar, lidar and camera imaging. The company builds an Intel vision processor and Nvidia graphics processor directly onto its in-house radar sensor, enabling cars to run machine-learning algorithms directly on the radar sensor. The result is a stack of perception technology that sees farther and wider, and processes road data faster than traditional autonomy tech, Lunn says. Swift predictive analytics gives vehicles and drivers more time to react to other cars.

The founder has worked in vision technology for nearly a decade and has previously worked with NASA, General Motors and Boeing under the radar company Arralis, which Lunn sold in 2017. The start-up is in talks with big automakers, and its vision has a strong team of trailblazers behind it, including Scott Thayer and Jeff Mishler, developers of early versions of autonomous tech for Google's Waymo and Uber... Lunn thinks the auto industry prematurely pushed autonomy as a solution, long before it was safe or practical to remove human drivers from the equation. He says AI decision-making will play a pivotal role in the future of auto safety, but only after it has been shown to reduce the issues that lead to crashes. The goal is to get the tech inside passenger cars so that the system can learn from human drivers, and understand how they make decisions, before allowing the AI to decide what happens in specified instances.

Space

Starlight Could Really Be a Vast Alien Quantum Internet, Physicist Proposes (vice.com) 76

Terry Rudolph, a professor of quantum physics at Imperial College London, suggests that interstellar light could actually be harnessed by spacefaring aliens to form an encrypted quantum internet. Motherboard reports: This may sound like the stuff of science fiction, but Rudolph says it was actually a natural extension of what he does as co-founder of PsiQuantum, a Silicon Valley-based company on a mission to build a scalable photonic quantum computer. He laid out his idea in a paper recently published on the arXiv preprint server. Rudolph said the idea for the paper on aliens communicating with quantum starlight flowed from his work on quantum computers. Unlike the quantum computers being pursued by the likes of Google or Intel, which use superconducting circuits or trapped ions at incredibly cold temperatures to create qubits (the quantum equivalent of a computer bit), photonic computers use light to accomplish the same thing. While Rudolph says this kind of quantum design is unconventional, it also has advantages over its rivals -- including the ability to operate at room temperature and easy integration into existing fiber optic infrastructure.

The primary way the aliens would create this kind of quantum internet is through a quantum mechanics principle called entanglement, explains Rudolph. In a nutshell, entanglement is a phenomenon in which the quantum states of particles (like photons) are linked together. This is what Einstein referred to as "spooky action at a distance," and it means that disturbing one particle will automatically affect its partner, even if they're miles apart. This entanglement would allow aliens -- or even humans -- to send encrypted signals between entangled partners, or nodes. Now, scale that single computer system up to a network potentially spanning the entire cosmos.

Aliens aside, Rudolph says that his paper demonstrates that building a photon-based quantum internet here on Earth might be "much easier than we expected." As for the aliens, even if they were using this kind of technology to transform waves of light into their own personal chat rooms, we'd have no way of knowing, says Rudolph. And even if we could pick out these light patterns in the sky, we still wouldn't be able to listen in. This is due to the incredibly shy nature of quantum particles -- any attempt to observe them by an outside party would alter their state and destroy the information they were carrying.
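The correlation-plus-disturbance behavior described above can be illustrated with a toy classical simulation of measuring the halves of a shared entangled (Bell) pair. This is a loose sketch of the quantum rules, not a real quantum simulation: same-basis measurements always agree, mismatched-basis measurements are random, and an eavesdropper who measures first in a random basis scrambles roughly a quarter of the rounds, which is what makes the interception detectable.

```python
import random

def measure(value: int, basis: str, measure_basis: str):
    """Measure a qubit with a definite value in `basis` (toy rule).

    Same basis: returns the stored value, state unchanged.
    Different basis: returns a 50/50 random result and collapses
    the state onto the new basis.
    """
    if basis == measure_basis:
        return value, (value, basis)
    new_value = random.randint(0, 1)
    return new_value, (new_value, measure_basis)

def bell_trial(eavesdrop: bool) -> bool:
    """One round: do Alice's and Bob's same-basis results agree?"""
    basis = random.choice("ZX")   # basis Alice and Bob agreed on
    a = random.randint(0, 1)      # Alice's result is intrinsically random
    qubit_b = (a, basis)          # her partner's qubit now carries that value
    if eavesdrop:                 # Eve measures first, in a random basis
        _, qubit_b = measure(qubit_b[0], qubit_b[1], random.choice("ZX"))
    b, _ = measure(qubit_b[0], qubit_b[1], basis)
    return a == b

random.seed(0)
n = 10_000
clean = sum(bell_trial(False) for _ in range(n)) / n
tapped = sum(bell_trial(True) for _ in range(n)) / n
print(f"agreement, no eavesdropper: {clean:.0%}")   # always 100%
print(f"agreement, eavesdropper:    {tapped:.0%}")  # roughly 75%
```

The drop from perfect agreement to about 75 percent is the fingerprint the legitimate parties would look for, which is the sense in which an outside observer "destroys the information" being carried.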

China

US Intel Agencies Are Reviewing Genetic Data From Wuhan Lab (cnn.com) 145

ytene writes: CNN is claiming an exclusive scoop, with an article reporting that U.S. intelligence agencies have scored a massive trove of Covid-19 genetic data, which, CNN suggests, comes from the Wuhan research lab. More than the complex challenge of absorbing and understanding the "mountain" of raw data, U.S. researchers are going to have to translate the material from native Mandarin before the real work can begin. Whilst there has obviously been a lot of interest in a clear identification of the source, it isn't clear how such a revelation could have a material impact on the efficacy of vaccines or the take-up of the treatment. It might, however, give useful clues to help understand where or how the next deadly outbreak could develop. "It's unclear exactly how or when U.S. intelligence agencies gained access to the information, but the machines involved in creating and processing this kind of genetic data from viruses are typically connected to external cloud-based servers -- leaving open the possibility they were hacked," notes CNN, citing multiple people familiar with the matter.

The report also notes that senior intelligence officials are "genuinely split between the two prevailing theories on the pandemic's origins." The World Health Organization says wildlife farms in southern China are the most likely source of the COVID-19 pandemic, but the theory that the virus accidentally escaped from a lab in Wuhan is still being investigated. According to a CNN report last month, "[S]enior Biden administration officials overseeing the 90-day review now believe the theory that the virus accidentally escaped from a lab in Wuhan is at least as credible as the possibility that it emerged naturally in the wild -- a dramatic shift from a year ago, when Democrats publicly downplayed the so-called lab leak theory."
AMD

AMD Ryzen 5000G Series Launches With Integrated Graphics At Value Price Points (hothardware.com) 69

MojoKid writes: AMD is taking the wraps off of its latest integrated processors known as Ryzen 7 5700G and the Ryzen 5 5600G. As their branding suggests, these new products are based on the same excellent AMD Zen 3 core architecture, but with integrated graphics capabilities on board as well, hence the "G" designation. AMD is targeting more mainstream applications with these chips. The Ryzen 7 5700G is an 8-core/16-thread CPU with 4MB of L2 cache and 16MB of L3. Those CPU cores are mated to an 8 CU (Compute Unit) Radeon Vega graphics engine, and it has 24 lanes of PCIe Gen 3 connectivity. The 5700G's base CPU clock is 3.8GHz, with a maximum boost clock of 4.6GHz. The on-chip GPU can boost up to 2GHz, which is a massive uptick from the 1.4GHz of previous-gen 3000-series APUs.

The Ryzen 5 5600G takes things down a notch with 6 CPU cores (12 threads) and a smaller 3MB L2 cache while L3 cache size remains unchanged. The 5600G's iGPU is scaled down slightly as well with only 7 CUs. At 3.9GHz, the 5600G's base CPU clock is 100MHz higher than the 5700G's, but its max boost lands at 4.4GHz with a slightly lower GPU boost clock of 1.9GHz. In the benchmarks, the Ryzen 5 5600G and Ryzen 7 5700G both offer enough multi-threaded muscle for the vast majority of users, often besting similar Intel 11th Gen Core series chips, with highly competitive single-thread performance as well.

Desktops (Apple)

Mac Pro Gets a Graphics Update (sixcolors.com) 23

On Tuesday, Apple rolled out three new graphics card modules for the Intel-based Mac Pro, all based on AMD's Radeon Pro W6000 series GPU. From a report: (Apple posted a Mac Pro performance white paper [PDF] to celebrate.) The new modules (in Apple's MPX format) come in three variants, with a Radeon Pro W6800X, two W6800X GPUs, and the W6900X. Each module also adds four Thunderbolt 3 ports and an HDMI 2 port to the Mac Pro. The Mac Pro supports two MPX modules, so you could pop in two of the dual-GPU modules to max out performance. They can connect using AMD's Infinity Fabric Link, which can connect up to four GPUs to communicate with one another via a super-fast connection with much more bandwidth than is available via the PCIe bus.
AMD

AMD and Valve Working On New Linux CPU Performance Scaling Design (phoronix.com) 10

Along with other optimizations to benefit the Steam Deck, AMD and Valve have been jointly working on CPU frequency/power scaling improvements to enhance the Steam Play gaming experience on modern AMD platforms running Linux. Phoronix reports: It's no secret that the ACPI CPUFreq driver code has at times been less than ideal on recent AMD processors, delivering lower-than-expected performance, being slow to ramp up to a higher performance state, or otherwise leaving users little recourse short of disabling the power management functionality outright. AMD hasn't traditionally worked on the Linux CPU frequency scaling code as much as Intel has on its P-State scaling driver and other areas of power management at large. AMD is ramping up efforts in these areas, including around the Linux scheduler, given its recent hiring spree, and it now looks like, thanks to the Steam Deck, there is renewed interest in better optimizing CPU frequency scaling under Linux.

AMD and Valve have been working to improve performance and power efficiency for modern AMD platforms running Steam Play (Proton / Wine) and have spearheaded what they describe as "a new CPU performance scaling design for AMD platform[s] which has better performance per watt scaling," citing a "3D game like Horizon Zero Dawn with VKD3D-Proton on Steam" as an example, after finding that the existing ACPI CPUFreq driver "was not very performance/power efficien[t] for modern AMD platforms." AMD will be presenting more about this effort next month at XDC. It's quite possible this new effort is focused on ACPI CPPC support with the previously proposed AMD_CPUFreq. Back when Zen 2 launched in 2019, AMD did post patches for a new CPUFreq driver that leveraged ACPI Collaborative Processor Performance Controls, but the driver was never mainlined, nor were any further iterations of the patches posted. When asked about that work a few times since then, AMD has said it basically came down to resource constraints; it wasn't a focus at the time. Upstream kernel developers also voiced their preference for seeing AMD improve the generic ACPI CPPC CPUFreq driver code rather than adding another vendor-specific solution. It's also possible AMD has been working on improvements around the now-default Schedutil governor, which uses scheduler utilization data to make CPU frequency scaling decisions.
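For context on what any new scaling design would replace: the kernel's default schedutil governor maps the scheduler's utilization signal to a frequency request with a simple formula, documented in the kernel's cpufreq guide as next_freq = 1.25 * max_freq * util / max, clamped to the hardware range. A minimal sketch of that policy (the frequencies and utilization values below are illustrative numbers, not real hardware data):

```python
def schedutil_next_freq(util: int, util_max: int,
                        max_freq: int, min_freq: int) -> int:
    """Frequency request per Linux's schedutil governor:
    next_freq = 1.25 * max_freq * util / max, clamped to the
    CPU's available frequency range. Units are kHz, as cpufreq
    sysfs reports them."""
    raw = int(1.25 * max_freq * util / util_max)
    return max(min_freq, min(raw, max_freq))

# Illustrative CPU: 400 MHz - 3.6 GHz, utilization on the kernel's 0-1024 scale
MIN_F, MAX_F = 400_000, 3_600_000

for util in (64, 512, 820, 1024):
    f = schedutil_next_freq(util, 1024, MAX_F, MIN_F)
    print(f"util {util:4d}/1024 -> request {f / 1_000_000:.2f} GHz")
```

The 1.25 factor builds in headroom: the request hits the maximum frequency at roughly 80 percent utilization rather than waiting for full saturation, which is exactly the ramp-up behavior the slow ACPI CPUFreq path struggled to deliver on AMD parts.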

Google

Google Will Abandon Qualcomm and Build Its Own Smartphone Processors This Year (cnbc.com) 57

Google announced Monday it will build its own smartphone processor, called Google Tensor, that will power its new Pixel 6 and Pixel 6 Pro phones this fall. From a report: It's another example of a company building its own chips to create what it felt wasn't possible with those already on the market. In this case, Google is ditching Qualcomm. The move follows Apple, which is using its own processors in its new computers instead of Intel chips. And like Apple, Google is using an Arm-based architecture. Arm processors are lower power and are used across the industry for mobile devices, from phones to tablets and laptops.

Google Tensor will power new flagship phones that are expected to launch in October. (Google will reveal more details about those phones closer to launch.) That, too, is a strategy shift for Google, which in recent years has focused on affordability in its Pixel devices instead of offering high-end phones. And it shows that Google is again trying to compete directly in the flagship space against Apple and Samsung. The name Google Tensor is a nod to the Tensor Processing Unit the company uses for cloud computing. It's a full system on a chip, or SoC, that the company says will offer big improvements to photo and video processing on phones, along with features like speech-to-text and translation. And it includes a dedicated processor that runs artificial intelligence applications, in addition to a CPU, GPU and image signal processor. It will allow the phone to process more information on the device instead of having to send data to the cloud.
Further reading: Google's New Pixel Phones Feature a Processor Designed In-House.
Intel

Intel Executive Posts Thunderbolt 5 Photo Then Deletes It (anandtech.com) 22

AnandTech: An executive visiting various research divisions across the globe isn't necessarily new, but the focus on social media, which drives named individuals at each company to keep their followers on the edge of their seats, means that we get a lot more insight into how these companies operate. The downside of posting to social media is that when certain images exposing unreleased information aren't vetted by PR or legal, we get a glimpse into the next generation of technology. That is what happened over the weekend.

EVP and GM of Intel's Client Computing Group Gregory Bryant last week spent some time at Intel's Israel R&D facilities on his first overseas Intel trip of 2021. An early post on Sunday morning, showcasing Bryant's trip to the gym to overcome jetlag, was followed by another later in the day with Bryant being shown the offices and the research. The post contained four photos, but it was rapidly deleted and replaced by a version with three. The removed photo showcased some new information about next-generation Thunderbolt technology. In the image we can see a poster on the wall touting '80G PHY Technology,' which means that Intel is working on a physical layer (PHY) for 80 Gbps connections. Off the bat, this is double the bandwidth of Thunderbolt 4, which runs at 40 Gbps.

The second line states that 'USB 80G is targeted to support the existing USB-C ecosystem,' which suggests Intel is aiming to keep the USB-C connector while doubling the effective bandwidth. The third line is where it gets technically interesting: 'The PHY will be based on novel PAM-3 modulation technology.' This is about how the 0s and 1s are transmitted -- traditionally we talk about NRZ encoding, which allows only a 0 or a 1, a single bit, to be transmitted per symbol. The natural progression is a scheme allowing two bits to be transferred per symbol, called PAM-4 (Pulse Amplitude Modulation), with the 4 denoting the number of distinct signal levels, enough to represent the four two-bit patterns (00, 01, 10, or 11). PAM-4, at the same frequency, thus has 2x the bandwidth of an NRZ connection. PAM-3 sits in between: it uses three signal levels, typically encoding three bits across each pair of symbols, for 1.5x the bandwidth of NRZ at the same symbol rate.
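The relative throughputs fall out of a little arithmetic on bits per symbol. A quick sketch of the comparison -- note that the 3-bits-per-2-symbols mapping for PAM-3 is the commonly used practical encoding, assumed here rather than confirmed by the leaked slide, and the symbol rate is an illustrative figure, not from the leak:

```python
import math

def pam_capacity(levels: int) -> float:
    """Theoretical bits per symbol for a PAM scheme with `levels` levels."""
    return math.log2(levels)

# Practical encodings round the theoretical capacity down to a usable ratio.
schemes = {
    "NRZ (PAM-2)": (2, 1.0),   # 1 bit per symbol
    "PAM-3":       (3, 1.5),   # 3 bits per 2 symbols in practice
    "PAM-4":       (4, 2.0),   # 2 bits per symbol
}

symbol_rate_gbaud = 20  # illustrative symbol rate

for name, (levels, bits_per_symbol) in schemes.items():
    theoretical = pam_capacity(levels)
    throughput = symbol_rate_gbaud * bits_per_symbol
    print(f"{name:12s} theoretical {theoretical:.3f} b/sym, "
          f"practical {bits_per_symbol:.1f} b/sym -> {throughput:.0f} Gbps")
```

The appeal of PAM-3 is the trade-off it strikes: it carries 50 percent more data per symbol than NRZ while needing to distinguish only three voltage levels instead of PAM-4's four, which relaxes the signal-to-noise requirements on the link.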

Intel

TSMC Will Start Making 2nm Chips As Intel Tries To Catch Up (gizmodo.com) 83

Kekke writes: "Taiwan Semiconductor Manufacturing Co.'s new foundry will produce 2-nanometer chips," reports Gizmodo. "Construction on the plant in Hsinchu, southwest from Taiwan's capital of Taipei, is expected to start as soon as early 2022. TSMC's 3nm tech is reportedly expected to be put into production in late 2022 -- meanwhile, Intel will be rolling out 7nm chips toward the end of 2022 and into 2023." Will Intel have a genie in the bottle or a rabbit in a hat? Doesn't seem so to me. On Tuesday, Intel unveiled a comeback plan designed to help it reclaim processor manufacturing leadership within four years.
Microsoft

Windows 11 Now Has Its First Beta Release (theverge.com) 49

Microsoft has released the first beta of Windows 11, available to those enrolled in its Windows Insider Program. From a report: Until today, getting access to Windows 11 meant installing the Dev preview, which Microsoft says is for "highly technical users" as it has "rough edges." According to Microsoft, the beta release is less volatile, with builds being validated by Microsoft (though it's still probably something you'll want to install on a test machine or second partition). Of course, to install the beta you'll need a compatible computer. Figuring out if your hardware will work with the next version of Windows has been notoriously tricky to pin down, but Microsoft's article about preparing for Insider builds directs people to its system requirements page. The company has said that it will be paying close attention to how well 7th Gen Intel and AMD Zen 1 CPUs work during the testing period, so it's possible those systems could be allowed to run the beta but not the final release.
Power

Dell Is Cancelling Alienware Gaming PC Shipments To Several US States (pcgamer.com) 86

davide marney writes: Orders for Alienware Aurora R12 and R10 gaming PC configurations placed in California, Colorado, Hawaii, Oregon, Vermont, or Washington will not be honored because of power consumption regulations, reports PC Gamer. "Any orders placed that are bound for those states will be canceled," Dell states in a message.

"The Aurora R12 and R10 are built around the latest generation processors from Intel and AMD, the former featuring 11th Gen Core Rocket Lake CPUs and the latter wielding Ryzen 5000 series chips based on Zen 3," reports PC Gamer. "Unfortunately for both Dell and buyers who reside in affected states, the majority of Aurora R12 and R10 configurations consume more power than local regulations allow. There are exceptions, though [depending on the configuration you select]."
Intel

Intel Details Comeback Plan To Leapfrog Chipmaking Rivals by 2025 (cnet.com) 72

Intel unveiled on Tuesday a smorgasbord of new technologies designed to help it reclaim processor manufacturing leadership within four years. The plans bear the fingerprints of newly installed CEO Pat Gelsinger, who has pledged to restore the company's engineering leadership and credibility. From a report: The developments include a new push to improve the power usage of Intel chips, a key element of battery life, while simultaneously raising chip performance. The technologies involve deep redesigns to how processors are constructed.

One technology, RibbonFET, fundamentally redesigns the transistor circuitry at the heart of all processors. Another, PowerVia, reimagines how electrical power is delivered to those transistors. Lastly, Intel is updating its Foveros technology for packaging chip elements from different sources into dense stacks of computing horsepower. Intel's commitments, unveiled at an online press event, will mean faster laptops with longer battery life, if realized. And the advancements could boost technologies like artificial intelligence at cloud computing companies and speed up the services on mobile phone networks. "In 2025, we think we will regain that performance crown," Sanjay Natarajan, who rejoined Intel this year to lead the company's processor technology development, said in an interview.
Further reading: Intel's foundry roadmap lays out the post-nanometer "Angstrom" era.
AMD

Leaked Intel i9-12900K Benchmark Shows Gains Over the Ryzen 5950X (digitaltrends.com) 90

UnknowingFool writes: An engineering sample of Intel's next flagship processor, the i9-12900K, was shown to beat AMD's current flagship 5950X in Cinebench R20 by 18% in multi-core and 28% in single-core tests. The next generation of Intel processors is believed to use a hybrid big.LITTLE design in which 8 of its 16 cores are low-power cores and 8 are full-power cores. The low-power cores run a single thread each, while the high-power cores can run two threads. There's no official word from Intel on pricing or a release date, but engineering samples and B600 motherboards are being sold in China for $1,250 and $1,150, respectively. According to leaker OneRaichu, the results for the 12900K were gathered using water-cooling and without overclocking, so it's possible the final score could be even higher. The rumors suggest the processor will come with 16 cores and 24 threads and a boost clock speed of up to 5.3GHz.
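The 24-thread figure follows directly from the hybrid layout described above, since only the SMT-capable full-power cores contribute a second thread. A quick check of the arithmetic (core counts as rumored; the function and labels are mine, for illustration):

```python
def total_threads(p_cores: int, e_cores: int, smt_ways: int = 2) -> int:
    """Thread count for a hybrid CPU where only the performance
    cores support SMT and the efficiency cores run one thread each."""
    return p_cores * smt_ways + e_cores

p_cores = 8  # full-power cores, 2 threads each
e_cores = 8  # low-power cores, 1 thread each

print(f"total cores:   {p_cores + e_cores}")        # 16
print(f"total threads: {total_threads(p_cores, e_cores)}")  # 8*2 + 8 = 24
```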
