Intel

MSI Leaks Intel 14th Gen Core Specs, Confirms It's 3% Faster on Average Than 13th Gen (videocardz.com) 56

An anonymous reader shares a report: MSI mistakenly shared an unlisted video, which has since leaked out. The leak is a product training video showing MSI's latest Intel 700 motherboard series and upcoming PC cases. While the majority of the video focuses on improvements to these motherboards, one slide briefly explains Intel's next-gen Core series. The slide (check the linked news post) confirms that Intel's 14th Gen Core series will see no major core-count upgrades. Only one of the upcoming K-series CPUs will see a core change in its hybrid configuration of Performance and Efficient cores.

The Core i7-14700K is getting 12 Efficient cores, an upgrade over the 8 E-cores of the Core i7-13700K. The Core i9-14900K, now confirmed by MSI as well, will use the same 8P+16E configuration, and the Core i5-14600K likewise keeps its 6P+8E config. [...] The CPU series will use the same Intel 7 process technology and will only add higher DDR5 frequency support. The company confirms that Intel 14th Gen Core is 3% faster on average compared to the current-gen series. The most important upgrade involves the Core i7-14700K, which has up to 17% faster multi-threaded (MT) performance thanks to the extra cores.

United Kingdom

UK To Spend $127M in Global Race To Produce AI Chips (theguardian.com) 24

The UK government will spend $127m to try to win a toe-hold for the nation in the global race to produce computer chips used to power artificial intelligence. From a report: Taxpayer money will be used as part of a drive to build a national AI resource in Britain, similar to those under development in the US and elsewhere. It is understood that the funds will be used to order key components from major chipmakers Nvidia, AMD and Intel. But an official briefed on the plans told the Guardian that the $127m offered by the government is far too low relative to investment by peers in the EU, US and China. The official also confirmed a detail first reported by the Telegraph, which revealed the investment: the government is in the advanced stages of an order of up to 5,000 graphics processing units (GPUs) from Nvidia. The company, which started out building processing capacity for computer games, has seen a sharp increase in its value as the AI race has heated up. Its chips can run large language models such as the one behind ChatGPT.
Intel

Intel Terminates Plan To Buy Tower Semiconductor (barrons.com) 16

Intel has dropped its planned $5.4 billion acquisition of Israel's Tower Semiconductor. It's a setback to Intel's plans to expand its chip-manufacturing business. From a report: Intel said Wednesday the deal, originally agreed in 2022, had been terminated due to delays in getting regulatory approval. Chinese regulators hadn't approved the deal by Tuesday's deadline. Intel will now have to pay a $353 million termination fee to Tower Semiconductor. However, the more painful consequence could be the blow to Intel's plans to build up its business making chips on contract for others via its Foundry Services unit.
Firefox

Does Desktop Linux Have a Firefox Problem? (osnews.com) 164

OS News' managing editor calls Firefox "the single most important desktop Linux application," shipping in most distros (with some users later opting for a post-installation download of Chrome).

But "I'm genuinely worried about the state of browsers on Linux, and the future of Firefox on Linux in particular..." While both GNOME and KDE nominally invest in their own browsers, GNOME Web and Falkon, their uptake is limited and releases few and far between. For instance, none of the major Linux distributions ship GNOME Web as their default browser, and it lacks many of the features users come to expect from a browser. Falkon, meanwhile, is updated only sporadically, often going years between releases. Worse yet, Falkon uses Chromium through QtWebEngine, and GNOME Web uses WebKit (which are updated separately from the browser, so browser releases are not always a solid metric!), so both are dependent on the goodwill of two of the most ruthless corporations in the world, Google and Apple respectively.

Even Firefox itself, even though it's clearly the browser of choice of distributions and Linux users alike, does not consider Linux a first-tier platform. Firefox is first and foremost a Windows browser, followed by macOS second, and Linux third. The love the Linux world has for Firefox is not reciprocated by Mozilla in the same way, and this shows in various places where issues fixed and addressed on the Windows side are ignored on the Linux side for years or longer. The best and most visible example of that is hardware video acceleration. This feature has been a default part of the Windows version since forever, but it wasn't enabled by default for Linux until Firefox 115, released only in early July 2023. Even then, the feature is only enabled by default for users of Intel graphics — AMD and Nvidia users need not apply. This lack of video acceleration was — and for AMD and Nvidia users, still is — a major contributing factor to Linux battery life on laptops taking a serious hit compared to their Windows counterparts... It's not just hardware accelerated video decoding. Gesture support has taken much longer to arrive on the Linux version than it did on the Windows version — things like using swipes to go back and forward, or pinch to zoom on images...

I don't see anyone talking about this problem, or planning for the eventual possible demise of Firefox, what that would mean for the Linux desktop, and how it can be avoided or mitigated. In an ideal world, the major stakeholders of the Linux desktop — KDE, GNOME, the various major distributions — would get together and seriously consider a plan of action. The best possible solution, in my view, would be to fork one of the major browser engines (or pick one and significantly invest in it), and modify this engine and tailor it specifically for the Linux desktop. Stop living off the scraps and leftovers thrown across the fence from Windows and macOS browser makers, and focus entirely on making a browser engine that is optimised fully for Linux, its graphics stack, and its desktops. Have the major stakeholders work together on a Linux-first — or even Linux-only — browser engine, leaving the graphical front-end to the various toolkits and desktop environments....

I think it's highly irresponsible of the various prominent players in the desktop Linux community, from GNOME to KDE, from Ubuntu to Fedora, to seemingly have absolutely zero contingency plans for when Firefox enshittifies or dies...

Linux

Should There Be an 'Official' Version of Linux? (zdnet.com) 283

Why aren't more people using Linux on the desktop? Slashdot reader technology_dude shares one solution: Jack Wallen at ZDNet says establishing an "official" version of Linux may (or may not) help Linux on the desktop increase the number of users, mostly as someplace to point new users. It makes sense to me. What does Slashdot think and what would be the challenges, other than acceptance of a particular flavor?
Wallen argues this would also create a standard for hardware and software vendors to target, which "could equate to even more software and hardware being made available to Linux." (And an "official" Linux might also be more appealing to business users.) Wallen suggests it be "maintained and controlled by a collective of people from users, developers, and corporations (such as Intel and AMD) with a vested interest in the success of this project... There would also be corporate backing for things like marketing (such as TV commercials)." He also suggests basing it on Debian, and supporting both Snap and Flatpak...

In comments on the original submission, long-time Slashdot reader bobbomo points instead to kernel.org, arguing "There already is an official version of Linux called mainline. Everything else is backports." And jd (Slashdot user #1,658) believes that the official Linux is the Linux Standard Base. "All distributions, more-or-less, conform to the LSB, which gives you a pseudo 'official' Linux. About the one variable is the package manager. And there are ways to work around that."

Unfortunately, according to Wikipedia... The LSB standard stopped being updated in 2015 and current Linux distributions do not adhere to or offer it; however, the lsb_release command is sometimes still available. On February 7, 2023, a former maintainer of the LSB wrote, "The LSB project is essentially abandoned."
That post (on the lsb-discuss mailing list) argues the LSB approach was "partially superseded" by Snaps and Flatpaks (for application portability and stability). And of course, long-time Slashdot user menkhaura shares the obligatory XKCD comic...

It's not exactly the same thing, but days after ZDNet's article, CIQ, Oracle, and SUSE announced the Open Enterprise Linux Association, a new collaborative trade association to foster "the development of distributions compatible with Red Hat Enterprise Linux."

So where does that leave us? Share your own thoughts in the comments.

And should there be an "official" version of Linux?
Intel

Intel's GPU Drivers Now Collect Telemetry, Including 'How You Use Your Computer' (extremetech.com) 44

An anonymous reader quotes a report from ExtremeTech: Intel has introduced a telemetry collection service, enabled by default, in the latest beta driver for its Arc GPUs. You can opt out of it, but we all know most people just click "yes" to everything during a software installation. Intel's release notes for the drivers don't mention this change, which is a curious omission. Adding telemetry collection is a significant change to how Intel's GPU drivers work. Intel has even given this new collection routine a cute name -- the Intel Computing Improvement Program. Gee, that sounds pretty wonderful. We want to improve our computing, so let's dive into the details briefly.

According to TechPowerUp, which discovered the change, Intel has created a landing page for the program that explains what is collected and what isn't. At a high level, it states, "This program uses information about your computer's performance to make product improvements that may benefit you in the future." Though that sounds innocuous, Intel provides a long list of the types of data it collects, many unrelated to your computer's performance. Those include the types of websites you visit, which Intel says are dumped into 30 categories and logged without URLs or information that identifies you, including how long and how often you visit certain types of sites. It also collects information on "how you use your computer" but offers no details. It will also identify "Other devices in your computing environment." Numerous performance-related data points are also captured, such as your CPU model, display resolution, how much memory you have, and, oddly, your laptop's average battery life.
The good news is that Intel allows you to opt out of this program, which is not the case with Nvidia. According to TechPowerUp, they don't even ask for permission! As for AMD, they not only give you a choice to opt out but they also explain what data they're collecting.
Intel

Intel DOWNFALL: New Vulnerability In AVX2/AVX-512 With Big Performance Hits (phoronix.com) 68

An anonymous reader quotes a report from Phoronix: This Patch Tuesday brings a new and potentially painful processor speculative execution vulnerability... Downfall, or, as Intel prefers to call it, GDS: Gather Data Sampling. GDS/Downfall affects the gather instruction on AVX2- and AVX-512-enabled processors. At least the latest-generation Intel CPUs are not affected, but everything from Tiger Lake / Ice Lake back to Skylake is confirmed to be impacted. A microcode mitigation is available, but it will be costly for AVX2/AVX-512 workloads with GATHER instructions in hot code paths, and thus means widespread software exposure, particularly for HPC and other compute-intensive workloads that have relied on AVX2/AVX-512 for better performance.

Downfall is characterized as a vulnerability due to a memory optimization feature that unintentionally reveals internal hardware registers to software. With Downfall, untrusted software can access data stored by other programs that typically should be off-limits: the AVX GATHER instruction can leak the contents of the internal vector register file during speculative execution. Downfall was discovered by security researcher Daniel Moghimi of Google. Moghimi has written demo code for Downfall to show 128-bit and 256-bit AES keys being stolen from other users on the local system as well as the ability to steal arbitrary data from the Linux kernel. Skylake processors are confirmed to be affected through Tiger Lake on the client side or Xeon Scalable Ice Lake on the server side. At least the latest Intel Alder Lake / Raptor Lake and Intel Xeon Scalable Sapphire Rapids are not vulnerable to Downfall. But for all the affected generations, CPU microcode is being released today to address this issue.

Intel acknowledges that its microcode mitigation for Downfall can impact performance where gather instructions are in an application's hot path. Given the AVX2/AVX-512 impact on vectorization-heavy workloads, HPC workloads are likely to be most affected, but we've also seen a lot of AVX use in video encoding/transcoding, AI, and other areas. Intel has not relayed any estimated performance impact claims from this mitigation -- to the press, at least. To other partners, Intel has reportedly communicated a performance impact of up to 50% for workloads with heavy gather instruction use as part of AVX2/AVX-512. Intel is being quite proactive in letting customers know they can disable the microcode change if they believe they will not be impacted by Downfall. Intel also believes pulling off a Downfall attack in the real world would be a very difficult undertaking. However, those matters are subject to debate.
Intel's official security disclosure is available here. The Downfall website is downfall.page.
Businesses

Germany Spends Big To Win $11 Billion TSMC Chip Plant (reuters.com) 35

TSMC is committing $3.8 billion to establish its first European factory in Germany, benefiting from significant state support for the $11 billion project as Europe aims to shorten supply chains. Reuters reports: The plant, which will be TSMC's third outside of traditional manufacturing bases Taiwan and China, is central to Berlin's ambition to foster the domestic semiconductor industry its car industry will need to remain globally competitive. Germany, which has been courting the world's largest contract chipmaker since 2021, will contribute up to 5 billion euros to the factory in Dresden, capital of the eastern state of Saxony, German officials said.

"Germany is now probably becoming the major location for semiconductor production in Europe," German Chancellor Olaf Scholz said, less than two months after Intel announced a 30 billion euro plan to build two chip-making plants in the country. "That is important for the resilience of production structures around the world, but it is also important for the future viability of our European continent, and it is of course particularly important for the future viability of Germany."

TSMC said it would invest up to 3.499 billion euros into a subsidiary, European Semiconductor Manufacturing Company (ESMC), of which it will own 70%. Germany's Bosch and Infineon and the Netherlands' NXP (NXPI.O) will each own 10% of the plant, which will make up to 40,000 wafers a month for cars and industrial and home products when it opens in 2027. The factory will cost around 10 billion euros in total.

AI

Companies Double Down on AI in June-Quarter Analyst Calls (reuters.com) 12

It's a high bar, but companies reporting second-quarter earnings in recent weeks have talked up artificial intelligence even more than in the previous quarter. From a report: S&P 500 companies that led in discussion of AI during quarterly conference calls with analysts earlier this year have outdone themselves in their latest quarterly calls. Following Intel's report late on Thursday, executives and analysts on its call mentioned AI 58 times, up from 15 mentions in its previous call in April.

Intel so far has missed out on the boom in components for AI computing, and sales in its data center and AI business fell 15% in the second quarter. Intel is now rushing to catch up with Nvidia and other rivals whose chips enable the technology behind ChatGPT. A 6.6% surge in Intel's shares on Friday following its report was due to optimism about a recovery in weak demand for personal computers.

Participants on Alphabet's analyst call on Tuesday mentioned AI 62 times, up from 52 times three months ago. The same day, AI was mentioned 58 times on Microsoft's call, up from 35 times in its previous call. The recent surge in companies talking about their plans related to AI reflects Wall Street's recent overwhelming optimism about using generative AI and related technologies to offer new services and boost efficiency across a spectrum of industries. That has helped fuel a 37% surge in the Nasdaq this year and a 20% gain in the S&P 500.

Intel

Intel Returns To Profitability After Two Quarters of Losses (cnbc.com) 21

Intel reported second-quarter earnings on Thursday, including a return to profitability after two straight quarters of losses, and a stronger-than-expected forecast. CNBC reports: For the third quarter, Intel expects earnings of $0.20 per share, adjusted, on revenue of $13.4 billion at the midpoint, versus analyst expectations of 16 cents per share on $13.23 billion in sales. Intel posted net income of $1.5 billion, or earnings of $0.35 per share, versus a net loss of $454 million, or a loss of 11 cents per share, in the same quarter last year.

Intel CFO David Zinsner said in a statement that part of the reason that Intel's report was stronger than expected was because of the progress it has made towards slashing $3 billion in costs this year. Earlier this year, Intel slashed its dividend and announced plans to save $10 billion per year by 2025, including through layoffs. Revenue fell to $12.9 billion from $15.3 billion a year ago, marking the sixth consecutive quarter of declining sales for the company.

Here's how Intel's business units performed:
- Intel's Client Computing group, which includes the company's laptop and desktop processor shipments, fell 12% annually to $6.8 billion. The overall PC market has been slumping for over a year.
- Intel's server chip division, which is reported as Data Center and AI, declined 15% to $4.0 billion in sales.
- Intel's Network and Edge division, which sells networking products for telecommunications, declined 28% to $1.4 billion.
- Mobileye, a publicly-traded Intel subsidiary focusing on self-driving cars, saw sales down 1% on an annual basis to $454 million.
- It reported $232 million in revenue for its foundry business, Intel Foundry Services, that makes chips for other companies.

EU

EU Enacts $48 Billion Chips Act in Bid To Boost Production (bloomberg.com) 21

The European Union's plan to bolster domestic semiconductor production will become law after ministers completed the final approval on Tuesday. From a report: The EU's Chips Act, which was approved by the European Parliament earlier this month, will take effect once it's published in the bloc's Official Journal. The European Commission first proposed the $47.5 billion Chips Act as part of an ambitious goal of producing 20% of the world's semiconductors by 2030. Numerous companies, including Intel and STMicroelectronics, have already announced new sites in Europe.
Intel

ASUS Will Manufacture and Develop New Intel NUC Mini PCs (engadget.com) 9

Intel has announced ASUS as the company's first partner for its Next Unit of Compute (NUC) mini PC business. From a report: The two companies have entered a non-binding agreement that will see ASUS manufacture, sell and support the 10th- to 13th-generation products in Intel's NUC line. ASUS will also develop future NUC designs. Based on the business' current lineup, ASUS could be developing future NUC mini PCs, DIY kits for mini PCs, DIY kits for laptops, customizable boards, chassis and other assembly elements.

If you'll recall, Intel recently told Engadget that it's ending its "direct investment" in its NUC business and will no longer produce first-party NUC products. It didn't elaborate on its reasoning, but working with partners for a non-essential business will free up resources it could use to concentrate on making chips. Intel previously said its first quarter earnings exceeded expectations, but its revenue was still down 36 percent year-over-year when compared to its results in the same period for 2022. The company also said that it remains cautious in this economy.

Intel

How Long Will the Last Intel Macs Be Supported? macOS Sonoma Gives Us Some Hints 72

An anonymous reader shares a report: A year ago, we compiled a model list of Macs spanning over two decades, complete with their launch dates, discontinuation dates, and all the available information about the macOS updates each model received. We were trying to answer two questions: How long can Mac owners reasonably expect to receive software updates when they buy a new computer? And were Intel Macs being dropped more aggressively now that the Apple Silicon transition was in full swing? The answer to the second question was a tentative "yes," and now that we know the official support list for macOS Sonoma, the trendline is clear.

Macs introduced between 2009 and 2015 could expect to receive seven or eight years of macOS updates -- that is, new major versions with new features, like Ventura or Sonoma -- plus another two years of security-only updates that fix vulnerabilities and keep Safari up to date. Macs released in 2016 and 2017 are only receiving about six years' worth of macOS updates, plus another two years of security updates. That's about a two-year drop, compared to most Macs released between 2009 and 2013. The last of the Intel Macs are still on track to be supported for longer than the last PowerPC Macs were in the mid-to-late 2000s, but they're getting fewer years of software update support than any other Macs released in the last 15 years.
AI

Bill Gates Calls AI's Risks 'Real But Manageable' (gatesnotes.com) 57

This week Bill Gates said "there are more reasons than not to be optimistic that we can manage the risks of AI while maximizing their benefits." One thing that's clear from everything that has been written so far about the risks of AI — and a lot has been written — is that no one has all the answers. Another thing that's clear to me is that the future of AI is not as grim as some people think or as rosy as others think. The risks are real, but I am optimistic that they can be managed. As I go through each concern, I'll return to a few themes:

- Many of the problems caused by AI have a historical precedent. For example, it will have a big impact on education, but so did handheld calculators a few decades ago and, more recently, allowing computers in the classroom. We can learn from what's worked in the past.

- Many of the problems caused by AI can also be managed with the help of AI.

- We'll need to adapt old laws and adopt new ones — just as existing laws against fraud had to be tailored to the online world.

Later Gates adds that "we need to move fast. Governments need to build up expertise in artificial intelligence so they can make informed laws and regulations that respond to this new technology."

But Gates acknowledged and then addressed several specific threats:
  • He thinks AI can be taught to recognize its own hallucinations. "OpenAI, for example, is doing promising work on this front."
  • Gates also believes AI tools can be used to plug AI-identified security holes and other vulnerabilities — and does not see an international AI arms race. "Although the world's nuclear nonproliferation regime has its faults, it has prevented the all-out nuclear war that my generation was so afraid of when we were growing up. Governments should consider creating a global body for AI similar to the International Atomic Energy Agency."
  • He's "guardedly optimistic" about the dangers of deep fakes because "people are capable of learning not to take everything at face value" — and the possibility that AI "can help identify deepfakes as well as create them. Intel, for example, has developed a deepfake detector, and the government agency DARPA is working on technology to identify whether video or audio has been manipulated."
  • "It is true that some workers will need support and retraining as we make this transition into an AI-powered workplace. That's a role for governments and businesses, and they'll need to manage it well so that workers aren't left behind — to avoid the kind of disruption in people's lives that has happened during the decline of manufacturing jobs in the United States."

Gates ends with this final thought:

"I encourage everyone to follow developments in AI as much as possible. It's the most transformative innovation any of us will see in our lifetimes, and a healthy public debate will depend on everyone being knowledgeable about the technology, its benefits, and its risks.

"The benefits will be massive, and the best reason to believe that we can manage the risks is that we have done it before."


Privacy

SEO Expert Hired and Fired By Ashley Madison Turned on Company, Promising Revenge (krebsonsecurity.com) 28

In July 2015, the marital infidelity website AshleyMadison.com was hacked by a group called the Impact Team, threatening to release data on all 37 million users unless the site shut down. In an article published earlier today, security researcher Brian Krebs explores the possible involvement of a former employee and self-described expert in search engine optimization (SEO), William Brewster Harrison, who had a history of harassment towards then-CEO Noel Biderman and may have had the technical skills to carry out the hack. However, Harrison committed suicide in 2014, raising doubts about his role in the breach. Here's an excerpt from the report: [...] Does Harrison's untimely death rule him out as a suspect, as his stepmom suggested? This remains an open question. In a parting email to Biderman in late 2012, Harrison signed his real name and said he was leaving, but not going away. "So good luck, I'm sure we'll talk again soon, but for now, I've got better things in the oven," Harrison wrote. "Just remember I outsmarted you last time and I will outsmart you and out maneuver you this time too, by keeping myself far far away from the action and just enjoying the sideline view, cheering for the opposition." Nothing in the leaked Biderman emails suggests that Ashley Madison did much to revamp the security of its computer systems in the wake of Harrison's departure and subsequent campaign of harassment -- apart from removing an administrator account of his a year after he'd already left the company.

KrebsOnSecurity found nothing in Harrison's extensive domain history suggesting he had any real malicious hacking skills. But given the clientele that typically employed his skills -- the adult entertainment industry -- it seems likely Harrison was at least conversant in the dark arts of "Black SEO," which involves using underhanded or else downright illegal methods to game search engine results. Armed with such experience, it would not have been difficult for Harrison to have worked out a way to maintain access to working administrator accounts at Ashley Madison. If that in fact did happen, it would have been trivial for him to sell or give those credentials to someone else. Or to something else. Like Nazi groups. As KrebsOnSecurity reported last year, in the six months leading up to the July 2015 hack, Ashley Madison and Biderman became a frequent subject of derision across multiple neo-Nazi websites.

Some readers have suggested that the data leaked by the Impact Team could have originally been stolen by Harrison. But that timeline does not add up given what we know about the hack. For one thing, the financial transaction records leaked from Ashley Madison show charges up until mid-2015. Also, the final message in the archive of Biderman's stolen emails was dated July 7, 2015 -- almost two weeks before the Impact Team would announce their hack. Whoever hacked Ashley Madison clearly wanted to disrupt the company as a business, and disgrace its CEO as the endgame. The Impact Team's intrusion struck just as Ashley Madison's parent company was preparing to go public with an initial public offering (IPO) for investors. Also, the hackers stated that while they stole all employee emails, they were only interested in leaking Biderman's. And the Impact Team had to know that ALM would never comply with their demands to dismantle Ashley Madison and Established Men. In 2014, ALM reported revenues of $115 million. There was little chance the company was going to shut down some of its biggest money machines. Hence, it appears the Impact Team's goal all along was to create prodigious amounts of drama and tension by announcing the hack of a major cheating website, and then let that drama play out over the next few months as millions of exposed Ashley Madison users freaked out and became the targets of extortion attacks and public shaming.

After the Impact Team released Biderman's email archives, several media outlets pounced on salacious exchanges in those messages as supposed proof he had carried on multiple affairs. Biderman resigned as CEO of Ashley Madison on Aug. 28, 2015. Complicating things further, it appears more than one malicious party may have gained access to Ashley Madison's network in 2015 or possibly earlier. Cyber intelligence firm Intel 471 recorded a series of posts by a user with the handle "Brutium" on the Russian-language cybercrime forum Antichat between 2014 and 2016. Brutium routinely advertised the sale of large, hacked databases, and on Jan. 24, 2015, this user posted a thread offering to sell data on 32 million Ashley Madison users. However, there is no indication whether anyone purchased the information. Brutium's profile has since been removed from the Antichat forum.
Note: This is Part II of a story published last week on reporting that went into a new Hulu documentary series on the 2015 Ashley Madison hack.
Intel

Intel Kills Its NUC Line (pcworld.com) 67

Intel has decided to stop making its Next Unit of Computing (NUC) line, but it will encourage partners to keep making the small form-factor (SFF) PCs, the company said Tuesday. From a report: Intel's NUC championed compact PCs, while leaving larger chassis options to partners like Dell and HP. But Intel's decision seems like a natural one, given that Intel has refocused on its core businesses during a period in which it also invested heavily in its own manufacturing operations and foundry business.

An Intel spokesman confirmed an initial report by Serve The Home, saying that Intel will continue to support the existing NUCs it has already shipped into the market. "We have decided to stop direct investment in the Next Unit of Compute (NUC) Business and pivot our strategy to enable our ecosystem partners to continue NUC innovation and growth," the Intel spokesman said in an email.

Programming

Why Are There So Many Programming Languages? (acm.org) 160

Long-time Slashdot reader theodp writes: Recalling a past Computer History Museum look at the evolution of programming languages, Doug Meil ponders the age-old question of Why Are There So Many Programming Languages? in a new Communications of the ACM blog post.

"It's worth noting and admiring the audacity of PL/I (1964)," Meil writes, "which was aiming to be that 'one good programming language.' The name says it all: Programming Language 1. There should be no need for 2, 3, or 4. [Meil expands on this thought in Lessons from PL/I: A Most Ambitious Programming Language.] Though PL/I's plans of becoming the Highlander of computer programming didn't play out like the designers intended, they were still pulling on a key thread in software: why so many languages? That question was already being asked as far back as the early 1960's."

One of PL/I's biggest fans was Digital Research Inc. (DRI) founder Gary Kildall, who crafted the PL/I-inspired PL/M (Programming Language for Microcomputers) in 1973 for Intel. But IBM priced PL/I higher than the languages it sought to replace, contributing to PL/I's failure to gain traction. (Along the lines of how IBM's deal with Microsoft gave rise to the price disparity that was the undoing of Kildall's CP/M OS: Microsoft's DOS, licensed to IBM in a 'non-royalty' deal, was offered with the PC at $40, while CP/M was offered 'a la carte' at $240.) As a comp.lang.pl1 poster explained in 2006, "The truth of the matter is that Gresham's Law: 'Bad money drives out good' or Ruskin's principle: 'The hoi polloi always prefer an inferior, cheap product over a superior, more expensive one' are what govern here."

Supercomputing

Inflection AI Develops Supercomputer Equipped With 22,000 Nvidia H100 AI GPUs 28

Inflection AI, an AI startup, has built a cutting-edge supercomputer equipped with 22,000 NVIDIA H100 GPUs. Wccftech reports: For those unfamiliar with Inflection AI, it is a business that aims to create "personal AI for everyone." The company is widely known for its recently introduced Inflection-1 AI model, which powers the Pi chatbot. Although the model hasn't yet reached the level of ChatGPT or Google's LaMDA, reports suggest that Inflection-1 performs well on "common sense" tasks, making it well suited to applications such as personal assistance.

Coming back, Inflection announced that it is building one of the world's largest AI supercomputers, and we now have a glimpse of it. The machine is reported to be equipped with 22,000 H100 GPUs and, based on analysis, would contain almost 700 four-node racks of Intel Xeon CPUs. The supercomputer will draw an astounding 31 megawatts of power.
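The reported numbers are roughly self-consistent. Here is a quick back-of-envelope sketch; note that the 8-GPUs-per-node figure is our assumption based on typical HGX H100 servers, and is not stated in the report:

```python
# Back-of-envelope check of the reported Inflection AI cluster figures.
# Assumption (ours, not in the report): 8 H100 GPUs per node, as in a
# typical HGX H100 server.

GPUS = 22_000
GPUS_PER_NODE = 8        # assumed HGX-style node
NODES_PER_RACK = 4       # per the report's "four-node racks"
TOTAL_POWER_W = 31e6     # 31 megawatts

nodes = GPUS / GPUS_PER_NODE           # 2750 nodes
racks = nodes / NODES_PER_RACK         # 687.5, i.e. "almost 700" racks
watts_per_gpu = TOTAL_POWER_W / GPUS   # ~1409 W per GPU, all-in

print(f"{nodes:.0f} nodes, {racks:.1f} racks, {watts_per_gpu:.0f} W/GPU")
```

At roughly 1.4 kW per GPU all-in, the power figure is plausible: an H100 SXM module alone is rated around 700 W, and the remainder covers CPUs, networking, and cooling overhead.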

The surprising part is the acquisition of 22,000 NVIDIA H100 GPUs in the first place. In recent times it has been challenging to acquire even a single H100, since the cards are in immense demand and NVIDIA cannot keep up with the influx of orders. In Inflection AI's case, NVIDIA is considering becoming an investor in the company, which makes it easier for the startup to get its hands on such a massive number of GPUs.

Firefox

Firefox 115 Released (mozilla.org) 61

williamyf writes: Today, Mozilla released Firefox 115. Changes most visible to users include:

* Hardware video decoding is now enabled for Intel GPUs on Linux.

* Migrating from another browser? Now you can bring over payment methods you've saved in Chrome-based browsers to Firefox.

* The Tab Manager dropdown now features close buttons, so you can close tabs more quickly.

* The Firefox for Android address bar's new search button allows you to easily switch between search engines and search your bookmarks and browsing history.

* We've refreshed and streamlined the user interface for importing data from other browsers.

* Users without platform support for H.264 video decoding can now fall back to Cisco's OpenH264 plugin for playback.

But the most important feature is that this release is the new ESR. Why is this important, y'all ask? Well:

* Many "downstream" projects depend on Firefox ESR, for example the famous email client Thunderbird and KaiOS (a mobile OS very popular in India, SE Asia, Africa and LatAm), so, for better or worse, whatever made it into (or is missing from) this version of the browser is what those projects have to use for the next year.

* Firefox ESR is the default browser of many distros, like Debian and Kali Linux, so whatever made it into this version will be there for the next year, ditto for whatever is lacking.

* If you are on an old, unsupported OS, like Windows 7, 8/8.1, or macOS 10.14 (Mojave, the last macOS with support for 32-bit apps), 10.13, or 10.12, you will automatically be migrated to Firefox ESR, so this will be your browser until Sept. 2024.


AMD

AMD CPU Use Among Linux Gamers Approaching 70% Marketshare (phoronix.com) 127

The June Steam Survey results show that AMD CPUs have gained significant popularity among Linux gamers, with a market share of 67% -- a remarkable 7-percentage-point increase from the previous month. Phoronix reports: In part that's due to the Steam Deck being powered by an AMD SoC, but it's also a trend that has been building for some time: AMD's Ryzen CPUs have grown increasingly popular among Linux users thanks to the company's open-source driver work and its continued goodwill-building with the community.

In comparison, last June the AMD CPU Linux gaming marketshare came in at 45% while Intel was at 54%. At the start of 2023, AMD CPUs were at a 55% marketshare among Linux gamers, and going back six years, AMD CPU use among Linux gamers was a mere 18% during the early Ryzen days. The picture is the direct opposite on the Windows side: in the June Steam Survey results limited to Windows, Intel has a 68% marketshare to AMD's 32%.
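Lined up, the share figures quoted above make the trend easy to see; note the May 2023 value of 60% is implied by the reported 7-point monthly gain, not stated directly:

```python
# AMD's CPU share among Linux gamers in the Steam Survey, per the article.
# The "May 2023" entry is inferred from the reported 7-point monthly gain.
amd_linux_share = {
    "mid-2017 (early Ryzen)": 18,
    "June 2022": 45,
    "Jan 2023": 55,
    "May 2023 (implied)": 60,
    "June 2023": 67,
}

latest = amd_linux_share["June 2023"]
for period, share in amd_linux_share.items():
    print(f"{period}: {share}% ({latest - share:+d} points vs June 2023)")
```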

Beyond the Steam Deck, AMD's efforts around open-source drivers, its expanded Linux client (Ryzen) development over the past two years, promises around openSIL, and other efforts commonly covered on Phoronix appear to be paying off in winning over Linux gamers.
