Microsoft

Microsoft's 'Netflix-for-Gaming' Service Launches on iPhone and PC This Week (cnbc.com) 28

Microsoft's Xbox Cloud Gaming service, previously known as xCloud, will begin rolling out in beta to iPhones, iPads and PCs this week. The service will be invite-only to start, Microsoft said in a blog post on Monday. From a report: Xbox Cloud Gaming was on track to launch for iPhones and iPads earlier, but in September Apple updated its App Store rules in a way that affected services like Xbox Cloud Gaming and Google Stadia. Apple's move forced the companies to redesign their services to run in web browsers so that they could circumvent the App Store rules. Under those rules, Microsoft, Google and other companies with similar services would have had to offer each game as an individual download instead of offering a complete library the way Netflix does for movies.

Xbox Cloud Gaming is sort of like Netflix for games. People who subscribe to Microsoft's $14.99/month Xbox Game Pass Ultimate plan can access more than 100 titles. The cloud gaming aspect lets you stream the games without having to download them, provided you have a fast enough internet connection. The streaming option is already available for Android phones.

Programming

Student's First Academic Paper Solves Decades-Old Quantum Computing Problem (abc.net.au) 92

"Sydney university student Pablo Bonilla, 21, had his first academic paper published overnight and it might just change the shape of computing forever," writes Australia's national public broadcaster ABC: As a second-year physics student at the University of Sydney, Mr Bonilla was given some coding exercises as extra homework and what he returned with has helped to solve one of the most common problems in quantum computing. His code spiked the interest of researchers at Yale and Duke in the United States and the multi-billion-dollar tech giant Amazon plans to use it in the quantum computer it is trying to build for its cloud platform Amazon Web Services....

Assistant professor Shruti Puri of Yale's quantum research program said the new code solved a problem that had persisted for 20 years. "What amazes me about this new code is its sheer elegance," she said. "Its remarkable error-correcting properties are coming from a simple modification to a code that has been studied extensively for almost two decades...."

Co-author of the paper, the University of Sydney's Ben Brown, said the brilliance of Pablo Bonilla's code was in its simplicity... "We just made the smallest of changes to a chip that everybody is building, and all of a sudden it started doing a lot better. It's quite amazing to me that nobody spotted it in the 20-or-so years that people have been working on that model."

AI

Google Researchers Boost Speech Recognition Accuracy With More Datasets (venturebeat.com) 15

What if the key to improving speech recognition accuracy is simply mixing all available speech datasets together to train one large AI model? That's the hypothesis behind a recent study published by a team of researchers affiliated with Google Research and Google Brain. They claim an AI model named SpeechStew, trained on a range of speech corpora, achieves state-of-the-art or near-state-of-the-art results on a variety of speech recognition benchmarks. VentureBeat reports: In pursuit of a solution, the Google researchers combined all available labeled and unlabeled speech recognition data curated by the community over the years. They drew on AMI, a dataset containing about 100 hours of meeting recordings, as well as corpora including Switchboard (approximately 2,000 hours of telephone calls), Broadcast News (50 hours of television news), LibriSpeech (960 hours of audiobooks), and Mozilla's crowdsourced Common Voice. Their combined dataset had over 5,000 hours of speech -- none of which was adjusted from its original form. With the assembled dataset, the researchers used Google Cloud TPUs to train SpeechStew, yielding a model with more than 100 million parameters. In machine learning, parameters are the internal values of a model that are learned during training. The researchers also trained a 1-billion-parameter model, but it suffered from degraded performance.
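The core recipe is easy to sketch. The following is a minimal, illustrative PyTorch example of the "mix everything" idea, not SpeechStew's actual pipeline; the corpus wrappers and toy data are stand-ins:

```python
# Minimal sketch: train on a plain concatenation of speech corpora,
# with no per-corpus reweighting or preprocessing. Illustrative only.
import torch
from torch.utils.data import ConcatDataset, DataLoader, Dataset

class SpeechCorpus(Dataset):
    """Wraps one corpus of (audio, transcript) pairs, kept in its original form."""
    def __init__(self, examples):
        self.examples = examples  # list of (waveform_tensor, transcript_str)

    def __len__(self):
        return len(self.examples)

    def __getitem__(self, idx):
        return self.examples[idx]

# Toy stand-ins for AMI, Switchboard, LibriSpeech, and so on.
ami = SpeechCorpus([(torch.randn(16000), "hello"), (torch.randn(16000), "world")])
librispeech = SpeechCorpus([(torch.randn(16000), "chapter one")])

# The key move: one combined training set, sampled uniformly at random.
speech_stew = ConcatDataset([ami, librispeech])
loader = DataLoader(speech_stew, batch_size=2, shuffle=True,
                    collate_fn=lambda batch: batch)  # keep variable-length audio

for batch in loader:
    pass  # each batch mixes corpora; feed it to the acoustic model's training step
```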

Once the team had a general-purpose SpeechStew model, they tested it on a number of benchmarks and found that it not only outperformed previously developed models but demonstrated an ability to adapt to challenging new tasks. Leveraging CHiME-6, a 40-hour dataset of conversations in homes recorded by distant microphones, the researchers fine-tuned SpeechStew to achieve accuracy in line with a much more sophisticated model. Transfer learning entails transferring knowledge from one domain to a different domain with less data, and it has shown promise in many subfields of AI. By taking a model like SpeechStew that's designed to understand generic speech and refining it at the margins, it's possible for AI to, for example, understand speech in different accents and environments.
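As a rough illustration of that fine-tuning step (a generic transfer-learning loop with synthetic data, not Google's code), the pretrained model is simply trained a little further, at a low learning rate, on the small target-domain corpus:

```python
import torch
from torch import nn

# Stand-in for a pretrained, general-purpose acoustic model.
pretrained = nn.Sequential(nn.Linear(80, 256), nn.ReLU(), nn.Linear(256, 32))

# Hypothetical small target-domain corpus: (features, labels) batches.
target_domain = [(torch.randn(8, 80), torch.randint(0, 32, (8,)))
                 for _ in range(10)]

# A small learning rate refines the model at the margins rather than retraining it.
optimizer = torch.optim.Adam(pretrained.parameters(), lr=1e-4)
loss_fn = nn.CrossEntropyLoss()

for epoch in range(3):
    for feats, labels in target_domain:
        optimizer.zero_grad()
        loss = loss_fn(pretrained(feats), labels)
        loss.backward()
        optimizer.step()
```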
Earth

Google Earth Now Shows Decades of Climate Change in Seconds (bloomberg.com) 66

Google Earth has partnered with NASA, the U.S. Geological Survey, the EU's Copernicus Climate Change Service, and Carnegie Mellon University's CREATE Lab to bring users time-lapse images of the planet's surface -- 24 million satellite photos taken over 37 years. Together they offer photographic evidence of a planet changing faster than at any time in millennia. Shorelines creep in. Cities blossom. Trees fall. Water reservoirs shrink. Glaciers melt and fracture. From a report: "We can objectively see global warming with our own eyes," said Rebecca Moore, director of Google Earth. "We hope that this can ground everyone in an objective, common understanding of what's actually happening on the planet, and inspire action." Timelapse, the name of the new Google Earth feature, is the largest video on the planet, according to a statement from the company: it required 2 million hours of processing in cloud computers and is the equivalent of 530,000 high-resolution videos. The tool stitches together nearly 50 years of imagery from the U.S.'s Landsat program, which is run by NASA and the USGS. When combined with images from complementary European Sentinel-2 satellites, Landsat provides the equivalent of complete coverage of the Earth's surface every two days. Google Earth is expected to update Timelapse about once a year.
Open Source

Inspur, China's Largest Cloud Hardware Vendor, Joins Open-Source Patent Consortium (zdnet.com) 7

An anonymous reader quotes a report from ZDNet: The Open Invention Network (OIN) defends the intellectual property (IP) rights of Linux and open-source software developers from patent trolls and the like. This is a global fight, and now the OIN has a new, powerful allied member in China: Inspur. Inspur is a leading worldwide provider of data center infrastructure, cloud computing, and artificial intelligence (AI) servers, and China's largest. While not a household name like Lenovo, Inspur ranks among the world's top-three server manufacturers.

Inspur is only the latest of many companies to join the OIN. Besides primarily hardware-oriented companies such as Inspur, members now include Baidu, China's largest search engine company, and global banks such as Barclays and the TD Bank Group. In 2021, even companies far removed from traditional Linux vendors such as Canonical, Red Hat, and SUSE recognize the importance of Linux and open-source software. Donny Zhang, VP of Inspur Information, said, "Linux and open source are critical elements in technologies which we are developing and provisioning. By joining the Open Invention Network, we are demonstrating our continued commitment to innovation, and supporting it with patent non-aggression in core Linux and adjacent open-source software."
"Linux is rewriting what is possible in infrastructure computing," says OIN CEO Keith Bergelt. "OSS-based cloud computing and on-premise data centers are driving down the cost-per-compute while significantly increasing businesses' ability to provision AI and machine-learning (ML) capabilities. We appreciate Inspur's participation in joining OIN and demonstrating its commitment to innovation and patent non-aggression in open source."
Intel

Nvidia To Make CPUs, Going After Intel (bloomberg.com) 111

Nvidia said it's offering its first server microprocessors, extending a push into Intel's most lucrative market with a chip aimed at handling the most complicated computing work. Intel shares fell more than 2% on the news. From a report: The graphics chipmaker has designed a central processing unit, or CPU, based on technology from Arm, a company it's trying to acquire from Japan's SoftBank Group. The Swiss National Supercomputing Centre and the U.S. Department of Energy's Los Alamos National Laboratory will be the first to use the chips in their computers, Nvidia said Monday at an online event. Nvidia has focused mainly on graphics processing units, or GPUs, which are used to power video games and data-heavy computing tasks in data centers. CPUs, by contrast, are generalist chips that can handle basic tasks like running operating systems. Expanding into this product category opens up more revenue opportunities for Nvidia.

Founder and Chief Executive Officer Jensen Huang has made Nvidia the most valuable U.S. chipmaker by delivering on his promise to give graphics chips a major role in the explosion in cloud computing. Data center revenue contributes about 40% of the company's sales, up from less than 7% just five years ago. Intel still has more than 90% of the market in server processors, which can sell for more than $10,000 each. The CPU, named Grace after the late pioneering computer scientist Grace Hopper, is designed to work closely with Nvidia graphics chips to better handle new computing problems, such as AI models with a trillion parameters. Nvidia says systems built around the new chip will be 10 times faster than those currently using a combination of Nvidia graphics chips and Intel CPUs. The new product will be available at the beginning of 2023, Nvidia said.

Microsoft

Microsoft is Acquiring Nuance Communications for $19.7 Billion (techcrunch.com) 19

Microsoft agreed today to acquire Nuance Communications, a leader in speech-to-text software, for $19.7 billion. From a report: In a post announcing the deal, Microsoft said this was about increasing its presence in the healthcare vertical, a place where Nuance has done well in recent years. In fact, Microsoft announced the Microsoft Cloud for Healthcare last year, and this deal is about accelerating its presence there. Nuance's products in this area include Dragon Ambient eXperience, Dragon Medical One and PowerScribe One for radiology reporting. "Today's acquisition announcement represents the latest step in Microsoft's industry-specific cloud strategy," the company wrote. The acquisition also builds on several integrations and partnerships the two companies have made in the last couple of years. Nuance boasts 10,000 healthcare customers, according to its website. Those include AthenaHealth, Johns Hopkins, Mass General Brigham and Cleveland Clinic to name but a few, and it was that customer base that attracted Microsoft to pay the price it did to bring Nuance into the fold.
Government

Would You Tell an Angel Investor How to Start a New Country? (1729.com) 59

Angel investor Balaji S. Srinivasan (also the former CTO of Coinbase) is now focused on 1729.com, which wants to give you money to do his bidding — or something like that. He's calling it "the first newsletter that pays you."

"It has a regular feed of paid tasks and tutorials with $1000+ in crypto prizes per day, and doubles as a vehicle for distributing a new book I've been writing called The Network State."

His latest post? "How to Start a New Country" (which envisions starting with a "cloud first" digital community): We recruit online for a group of people interested in founding a new virtual social network, a new city, and eventually a new country. We build the embryonic state as an open source project, we organize our internal economy around remote work, we cultivate in-person levels of civility, we simulate architecture in VR, and we create art and literature that reflects our values.

Over time we eventually crowdfund territory in the real world, but not necessarily contiguous territory. Because an under-appreciated fact is that the internet allows us to network enclaves. Put another way, a cloud community need not acquire all its territory in one place at one time. It can connect a thousand apartments, a hundred houses, and a dozen cul-de-sacs in different cities into a new kind of fractal polity with its capital in the cloud. Over time, community members migrate between these enclaves and crowdfund territory nearby, with every individual dwelling and group house presenting an independent opportunity for expansion...

[Cloud countries] are set up to be a scaled live action role-playing game (LARP), a feat of imagination practiced by large numbers of people at the same time. And the experience of cryptocurrencies over the last decade shows us just how powerful such a shared LARP can be...

The cloud country concept "just" requires stacking together many existing technologies, rather than inventing new ones like Mars-capable rockets or permanent-habitation seasteads. Yet at the same time it avoids the obvious pathways of election, revolution, and war — all of which are ugly and none of which provide much venue for individual initiative...

Could a sufficiently robust cloud country with, say, 1-10M committed digital citizens, provable cryptocurrency reserves, and physical holdings all over the earth similarly achieve societal recognition from the United Nations?

For the "do his bidding" part, the post promises that up to ten $100 prizes will be awarded to people who share constructive reviews on their sites/social media pages (including proposals for extensions).

Previously the site had offered $100 each to the ten best hirelings "running a newsletter for technological progressives at your own domain, as a way to begin incentivizing the decentralization of media." (It cited a tweet that argues succinctly that "The NYT is telling anti-longevity stories for us. We must take control of our own story.") In general the site describes itself as "a newsletter for technological progressives. That means people who are into cryptocurrencies, startup cities, mathematics, transhumanism, space travel, reversing aging, and initially-crazy-seeming-but-technologically-feasible ideas." So the newsletter-creating task had envisioned them all "constantly pushing for technology in general and reversing aging in particular, writing like their lives depended on it. In other words, blog or die!"

Other rewards went to the first 10 people to complete three Elixir problems, the 100 people who posted the best inspiring proof-of-exercising photos, and 40 people who helped identify people and places "where the ascending world is surpassing the declining world."

For one of his latest "tasks," Srinivasan wants you to read a long essay on quantum computing (and answer questions), with an optional series of "review emails". $10 in bitcoin will be awarded only to the first and last 50 readers/question-answerers, while another $100 in bitcoin will be awarded to the first and last 5 review-email readers who "persist for a month."
Crime

US Arrests Suspect Who Wanted To Blow Up AWS Data Center (therecord.media) 151

An anonymous reader quotes a report from The Record: The FBI on Thursday arrested a Texas man who planned to blow up one of the Amazon Web Services (AWS) data centers in an attempt to "kill off about 70% of the internet." Seth Aaron Pendley, 28, of Wichita Falls, Texas, was arraigned before a Texas judge today and formally charged with a malicious attempt to destroy a building with an explosive.

The US Department of Justice said Pendley was arrested on Thursday after he tried to acquire C-4 plastic explosives from an undercover FBI employee in Fort Worth, Texas. The FBI said it learned of Pendley's plans after the suspect, in January 2021, confided his plans to blow up one of Amazon's Virginia-based data centers to a third-party source via Signal, an encrypted communications app. The source alerted the FBI and introduced the suspect to the undercover agent on March 31.
"The suspect allegedly told an FBI agent that he wanted to attack Amazon's data center because the company was providing web servers to the FBI, CIA, and other federal agencies and that he hoped to bring down 'the oligarchy' currently in power in the United States," the report says.

Pendley could face up to 20 years in federal prison if he's found guilty and convicted.
Windows

Microsoft Is Finally Releasing a 64-Bit Version of OneDrive For Windows (engadget.com) 75

Microsoft is finally releasing a 64-bit version of OneDrive, roughly 14 years after the first 64-bit version of Windows was released. Engadget reports: In an announcement spotted by Windows guru Paul Thurrott, the company says the new version of OneDrive will help those who need to transfer large files or many files at the same time since 64-bit systems can access more resources than their 32-bit counterparts.

"We know this has been a long-awaited and highly requested feature, and we're thrilled to make it available for early access," the company said. "You can now download the 64-bit version for use with OneDrive work, school, and home accounts." One thing to note is the preview is currently only available on x64 installs of Windows. If you own a computer like the Surface Pro X -- and therefore have Windows 10 on ARM installed on your system -- you'll have to wait. Microsoft recommends you continue using the 32-bit version for the time being.

Software

UK Software Reseller Sues Microsoft For $370 Million (ft.com) 57

A British company is suing Microsoft for $370m in damages [Editor's note: the link may be paywalled; alternative source] in the English High Court, alleging that the US company is trying to crush a multibillion-dollar market in second-hand versions of its software. From a report: ValueLicensing buys pre-owned Microsoft software licences from companies that upgrade their IT or become insolvent, and then resells them across the UK and Europe. It claims on its website that its customers can save up to 70 per cent by buying used software, and points to one NHS Trust that allegedly saved $1.37m by using Microsoft Office 2019, rather than the latest version of the office tools suite. Jonathan Horley, ValueLicensing's founder, accused Microsoft of harming competition in the used software market by persuading companies to relinquish their perpetual licences, often in exchange for discounts on Microsoft's cloud-based software, such as Office 365. "Microsoft has an incentive to move to its new cloud-based model and remove the old licences from the market so customers have no choice but to move to its subscription model," said Mr Horley, in an interview with the Financial Times.
IBM

IBM Creates a COBOL Compiler For Linux On x86 (theregister.com) 188

IBM has announced a COBOL compiler for Linux on x86. "IBM COBOL for Linux on x86 1.1 brings IBM's COBOL compilation technologies and capabilities to the Linux on x86 environment," said IBM in an announcement, describing it as "the latest addition to the IBM COBOL compiler family, which includes Enterprise COBOL for z/OS and COBOL for AIX." The Register reports: COBOL -- the common business-oriented language -- has its roots in the 1950s and is synonymous with the mainframe age and difficulties paying down technical debt accrued since a bygone era of computing. So why is IBM -- which is today obsessed with hybrid clouds -- bothering to offer a COBOL compiler for Linux on x86? Because IBM thinks you may want your COBOL apps in a hybrid cloud, albeit the kind of hybrid IBM fancies, which can mean a mix of z/OS, AIX, mainframes, POWER systems and actual public clouds.
[...]
But the announcement also suggests IBM doesn't completely believe this COBOL on x86 Linux caper has a future as it concludes: "This solution also provides organizations with the flexibility to move workloads back to IBM Z should performance and throughput requirements increase, or to share business logic and data with CICS Transaction Server for z/OS." The new offering requires RHEL 7.8 or later, or Ubuntu Server 16.04 LTS, 18.04 LTS, or later.

Intel

Intel Launches First 10nm 3rd Gen Xeon Scalable Processors For Data Centers (hothardware.com) 42

MojoKid writes: Intel just officially launched its first server products built on its advanced 10nm manufacturing process node, the 3rd Gen Xeon Scalable family of processors. 3rd Gen Xeon Scalable processors are based on the 10nm Ice Lake-SP microarchitecture, which incorporates a number of new features and enhancements. Core counts have been significantly increased with this generation, and now offer up to 40 cores / 80 threads per socket versus 28 cores / 56 threads in Intel's previous-gen offerings. The 3rd Gen Intel Xeon Scalable processor platform also supports up to 8 channels of DDR4-3200 memory, up to 6 terabytes of total memory, and up to 64 lanes of PCIe Gen4 connectivity per socket, for more bandwidth, higher capacity, and copious I/O.
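For context, those memory numbers translate into substantial bandwidth. A back-of-the-envelope calculation (assuming the standard 64-bit DDR4 channel width):

```python
# Peak theoretical memory bandwidth for 8 channels of DDR4-3200.
channels = 8
transfers_per_sec = 3_200_000_000   # DDR4-3200 = 3200 megatransfers/s
bytes_per_transfer = 8              # each channel is 64 bits wide

peak_gb_s = channels * transfers_per_sec * bytes_per_transfer / 1e9
print(f"~{peak_gb_s:.1f} GB/s per socket")  # ~204.8 GB/s
```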

New AI, security and cryptographic capabilities arrive with the platform as well. Across Cloud, HPC, 5G, IoT, and AI workloads, new 3rd Gen Xeon Scalable processors are claimed to offer significant uplifts across the board versus their previous-gen counterparts. And versus rival AMD's EPYC platform, Intel is also claiming many victories, specifically when AVX-512, new crypto instructions, or DL Boost are added to the equation. Core counts in the line-up range from 8 to 40 cores per processor, and TDPs vary depending on the maximum base and boost frequencies and core count / configuration (up to a 270W TDP). Intel is currently shipping 3rd Gen Xeon Scalable CPUs to key customers now, with over 200K chips in Q1 this year and a steady ramp-up to follow.

Microsoft

Microsoft is Now Submerging Servers Into Liquid Baths (theverge.com) 82

Microsoft is starting to submerge its servers in liquid to improve their performance and energy efficiency. A rack of servers is now being used for production loads in what looks like a liquid bath. From a report: This immersion process has existed in the industry for a few years now, but Microsoft claims it's "the first cloud provider that is running two-phase immersion cooling in a production environment." The cooling works by completely submerging server racks in a specially designed non-conductive fluid. The fluorocarbon-based liquid removes heat as it directly contacts components; it boils at just 122 degrees Fahrenheit (50 degrees Celsius), and the resulting vapor condenses and falls back into the bath as a raining liquid. This creates a closed-loop cooling system, reducing costs as no energy is needed to move the liquid around the tank, and no chiller is needed for the condenser either. "It's essentially a bath tub," explains Christian Belady, vice president of Microsoft's data center advanced development group, in an interview with The Verge. "The rack will lie down inside that bath tub, and what you'll see is boiling just like you'd see boiling in your pot. The boiling in your pot is at 100 degrees Celsius, and in this case it's at 50 degrees Celsius."
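The physics of the closed loop can be sketched with a simple steady-state energy balance. The numbers below are assumptions for illustration (Microsoft hasn't published rack power or the fluid's latent heat), but they show why boiling is such an effective heat carrier:

```python
# Toy steady-state energy balance for two-phase immersion cooling.
rack_power_w = 50_000          # assumed: a dense 50 kW rack
latent_heat_j_per_kg = 90_000  # assumed: ~90 kJ/kg heat of vaporization
                               # (order of magnitude for engineered fluorocarbons)

# All rack heat must leave as vapor: P = m_dot * h_fg  =>  m_dot = P / h_fg
vapor_kg_per_s = rack_power_w / latent_heat_j_per_kg
print(f"~{vapor_kg_per_s:.2f} kg of fluid boils off per second")  # ~0.56 kg/s
```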
Security

GitHub is Investigating Crypto-mining Campaign Abusing Its Server Infrastructure (therecord.media) 27

An anonymous Slashdot reader shared this report from The Record: Code-hosting service GitHub is actively investigating a series of attacks against its cloud infrastructure that allowed cybercriminals to implant and abuse the company's servers for illicit crypto-mining operations, a spokesperson told The Record today.

The attacks have been going on since the fall of 2020 and have abused a GitHub feature called GitHub Actions, which allows users to automatically execute tasks and workflows once a certain event happens inside one of their GitHub repositories. In a phone call today, Dutch security engineer Justin Perdok told The Record that at least one threat actor is targeting GitHub repositories where GitHub Actions might be enabled. The attack involves forking a legitimate repository, adding malicious GitHub Actions to the original code, and then filing a Pull Request with the original repository in order to merge the code back into the original.

But the attack doesn't rely on the original project owner approving the malicious Pull Request. Just filing the Pull Request is enough for the attack, Perdok said. The Dutch security engineer told us attackers specifically target GitHub project owners that have automated workflows that test incoming pull requests via automated jobs. Once one of these malicious Pull Requests is filed, GitHub's systems will read the attacker's code and spin up a virtual machine that downloads and runs cryptocurrency-mining software on GitHub's infrastructure.

Perdok, who's had projects abused this way, said he's seen attackers spin up as many as 100 crypto-miners via one attack alone, creating huge computational loads for GitHub's infrastructure. The attacks appear to be happening at random and at scale; Perdok said he identified at least one account creating hundreds of Pull Requests containing malicious code.
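A project owner worried about this pattern can audit for it. The sketch below (a defensive illustration, not GitHub's own tooling) uses GitHub's public REST API to flag recent workflow runs triggered by pull requests whose code lives in a fork; the repository names are hypothetical, and pagination and authentication are omitted:

```python
import requests

OWNER, REPO = "example-org", "example-repo"   # hypothetical repository
url = f"https://api.github.com/repos/{OWNER}/{REPO}/actions/runs"

resp = requests.get(url, headers={"Accept": "application/vnd.github+json"},
                    timeout=10)
runs = resp.json().get("workflow_runs", [])

for run in runs:
    head = (run.get("head_repository") or {}).get("full_name", "")
    if run.get("event") == "pull_request" and head != f"{OWNER}/{REPO}":
        # A run whose workflow code came from a fork: worth auditing.
        print(run["id"], run.get("head_branch"), "from fork:", head)
```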

Privacy

Did Patient Health Information Leak Into GitHub's Arctic Code Vault? (healthitsecurity.com) 25

HealthITSecurity writes: The patient data from multiple providers appears to have been captured and subsequently leaked on the data repository GitHub Arctic Code Vault by third-party vendor MedData, according to a new collaborative report from security researcher Jelle Ursem and Dissent Doe of DataBreaches.net.

Through his research, Ursem detected troves of protected health information tied to a single developer... The databases were taken down on December 17. MedData recently released a notice that detailed the massive patient data breach, which involved information provided to the vendor for processing services... Officials discovered that an employee had saved files to personal folders created on the GitHub repository between December 2018 and September 2019, during their employment...

The impacted data included patient names combined with one or more data elements, such as subscriber IDs, Social Security numbers, diagnoses, conditions, claims data, dates of service, medical procedure codes, insurance policy numbers, provider names, contact details, and dates of birth. All affected patients will receive free credit monitoring and identity protection services... This is the second report from Ursem and Dissent on GitHub repositories leaking patient data in the last six months. In August, they reported that at least nine GitHub repositories leveraging improper access controls leaked data from more than 150,000 to 200,000 patients. The data belonged to multiple providers.

The incidents highlight the importance of vendor management and the need to ensure security policies are aligned. Previous reports have shown about one-third of healthcare databases stored in the cloud, or even locally, are actively leaking data online. What's worse, misconfigured databases can be hacked in about eight hours.
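Leaks like these are typically found by scanning repositories for identifier-shaped strings. A deliberately simple illustration (not Ursem's actual methodology, which is far more involved) that greps a checked-out repository for US Social Security number patterns:

```python
import re
from pathlib import Path

SSN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")  # classic NNN-NN-NNNN format

for path in Path(".").rglob("*"):
    if not path.is_file():
        continue
    try:
        text = path.read_text(errors="ignore")
    except OSError:
        continue
    for hit in SSN.findall(text):
        print(f"{path}: possible SSN {hit[:3]}-**-****")  # redact the match
```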

DataBreaches.net wonders what happened after MedData reached out to GitHub about the vault's logs and removal of the code. Did GitHub provide the logs? If so, what did they show? Is anyone's Protected Health Information in GitHub's Arctic Code Vault? And if so, what happens? Will GitHub remove it...? Or will code just be left there for researchers to explore in 1,000 years so they can wade through the personal and protected health information or other sensitive information of people who trusted others to protect their privacy?

In November 2020, Ursem posed the question to GitHub on Twitter. They never replied.

Government

Weather Service Internet Systems Are Crumbling As Key Platforms Are Taxed and Failing (washingtonpost.com) 111

An anonymous reader shares an excerpt from a Washington Post article, written by Matthew Cappucci and Jason Samenow: The National Weather Service experienced a major, systemwide Internet failure Tuesday morning, making its forecasts and warnings inaccessible to the public and limiting the data available to its meteorologists. The outage highlights systemic, long-standing issues with its information technology infrastructure, which the agency has struggled to address as demands for its services have only increased. In addition to Tuesday morning's outage, the Weather Service has encountered numerous, repeated problems with its Internet services in recent months, including: a bandwidth shortage that forced it to propose and implement limits to the amount of data its customers can download; the launch of a radar website that functioned inadequately and enraged users; a flood at its data center in Silver Spring, Md., that has stripped access to key ocean buoy observations; and multiple outages to NWS Chat, its program for conveying critical information to broadcasters and emergency managers, relied upon during severe weather events. The Weather Service is working to evaluate and implement solutions to these problems which are, in the meantime, impacting its ability to fulfill its mission of protecting life and property. [...]

Problems with the Weather Service's Internet systems have persisted for years, in part because of increasing demand from users, which the agency has struggled to meet. In December, because of an escalating bandwidth shortage, the Weather Service proposed limiting users to 60 connections per minute on a large number of its websites. Constituents complained about the quota and, earlier this month, the Weather Service announced it would instead impose a data limit of 120 requests per minute and only on servers hosting model data, beginning April 20. Meanwhile, on March 9, the Weather Service's headquarters in Silver Spring "experienced a ruptured water pipe, which caused significant and widespread flooding," which affected a data center, the agency said in a statement. "Some NWS data stopped flowing, including data from ocean buoys," the statement said, noting some of the buoys are used "to detect and locate a seismic event that could cause a tsunami."
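For clients of the model-data servers, staying under the new quota is a classic throttling problem. A minimal client-side token-bucket sketch (the 120 requests/minute figure is from the article; everything else is illustrative):

```python
import time

class TokenBucket:
    """Blocks callers so requests average at most rate_per_min per minute."""
    def __init__(self, rate_per_min=120, burst=120):
        self.capacity = burst
        self.tokens = float(burst)
        self.rate = rate_per_min / 60.0   # tokens replenished per second
        self.last = time.monotonic()

    def acquire(self):
        while True:
            now = time.monotonic()
            self.tokens = min(self.capacity,
                              self.tokens + (now - self.last) * self.rate)
            self.last = now
            if self.tokens >= 1:
                self.tokens -= 1
                return
            time.sleep((1 - self.tokens) / self.rate)

bucket = TokenBucket()
# Calling bucket.acquire() before each download keeps a client under the quota.
```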

Neil Jacobs, former acting head of the National Oceanic Atmospheric Administration, which oversees the Weather Service, said many of the agency's Internet infrastructure problems are tied to the fact they run on internal hardware rather than through cloud service providers such as Amazon Web Services, Microsoft and Google Cloud. "I've demanded in writing that NWS transition these applications to our Cloud partners. It's part of an internal strategy I've laid out," Jacobs, a Trump administration appointee, told the Capital Weather Gang in an email before he left office. In July, NOAA released its Cloud Strategy, which stated, "the volume and velocity of our data are expected to increase exponentially with the advent of new observing system and data-acquisition capabilities, placing a premium on our capacity and wherewithal to scale the IT infrastructure and services to support this growth. Modernizing our infrastructure requires leveraging cloud services as a solution to meet future demand."

Microsoft

Microsoft Wins US Army Contract for Augmented-Reality Headsets, Worth Up To $21.9 Billion Over 10 Years (cnbc.com) 55

The Pentagon announced that Microsoft has won a contract to build more than 120,000 custom HoloLens augmented-reality headsets for the U.S. Army. The contract could be worth up to $21.88 billion over 10 years, a Microsoft spokesperson said. From a report: The deal shows Microsoft can generate meaningful revenue from a futuristic product resulting from years of research, beyond core areas such as operating systems and productivity software. It follows a $480 million contract Microsoft received to give the Army prototypes of the Integrated Visual Augmented System, or IVAS, in 2018. The new deal will involve providing production versions.

The standard-issue HoloLens, which costs $3,500, enables people to see holograms overlaid over their actual environments and interact using hand and voice gestures. An IVAS prototype that a CNBC reporter tried out in 2019 displayed a map and a compass and had thermal imaging to reveal people in the dark. The system could also show the aim for a weapon. "The IVAS headset, based on HoloLens and augmented by Microsoft Azure cloud services, delivers a platform that will keep soldiers safer and make them more effective," Alex Kipman, a technical fellow at Microsoft and the person who introduced the HoloLens in 2015, wrote in a blog post. "The program delivers enhanced situational awareness, enabling information sharing and decision-making in a variety of scenarios."

Security

Ubiquiti Massively Downplayed a 'Catastrophic' Security Breach To Minimize Impact On Stock Price, Alleges Whistleblower (krebsonsecurity.com) 100

In January, Ubiquiti Networks sent out a notification to its customers informing them of a security breach and asking all users to change their account passwords and turn on two-factor authentication. "We recently became aware of unauthorized access to certain of our information technology systems hosted by a third party cloud provider," Ubiquiti said at the time. Now, according to Krebs on Security, a whistleblower "alleges Ubiquiti massively downplayed a 'catastrophic' incident to minimize the hit to its stock price, and that the third-party cloud provider claim was a fabrication." From the report: "It was catastrophically worse than reported, and legal silenced and overruled efforts to decisively protect customers," [the source] wrote in a letter to the European Data Protection Supervisor. "The breach was massive, customer data was at risk, access to customers' devices deployed in corporations and homes around the world was at risk."

According to [the source], the hackers obtained full read/write access to Ubiquiti databases at Amazon Web Services (AWS), which was the alleged "third party" involved in the breach. Ubiquiti's breach disclosure, he wrote, was "downplayed and purposefully written to imply that a 3rd party cloud vendor was at risk and that Ubiquiti was merely a casualty of that, instead of the target of the attack." In reality, [the source] said, the attackers had gained administrative access to Ubiquiti's servers at Amazon's cloud service, which secures the underlying server hardware and software but requires the cloud tenant (client) to secure access to any data stored there. "They were able to get cryptographic secrets for single sign-on cookies and remote access, full source code control contents, and signing keys exfiltration," [the source] said.

[The source] says the attacker(s) had access to privileged credentials that were previously stored in the LastPass account of a Ubiquiti IT employee, and gained root administrator access to all Ubiquiti AWS accounts, including all S3 data buckets, all application logs, all databases, all user database credentials, and secrets required to forge single sign-on (SSO) cookies. Such access could have allowed the intruders to remotely authenticate to countless Ubiquiti cloud-based devices around the world. According to its website, Ubiquiti has shipped more than 85 million devices that play a key role in networking infrastructure in over 200 countries and territories worldwide.
Instead of asking customers to change their passwords when they next log on, [the source] says Ubiquiti should've immediately invalidated all of its customers' credentials and forced a reset on all accounts, mainly because the intruders already had credentials needed to remotely access customer IoT systems.
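Why signing-key exfiltration is so serious is easy to demonstrate. A toy example (generic HMAC-signed cookies, not Ubiquiti's actual scheme): anyone holding the key can mint a session token the service will accept, with no password or login involved:

```python
import hashlib
import hmac

SIGNING_KEY = b"assumed-stolen-sso-key"   # hypothetical exfiltrated secret

def mint_cookie(user: str, key: bytes) -> str:
    sig = hmac.new(key, user.encode(), hashlib.sha256).hexdigest()
    return f"{user}.{sig}"

def verify_cookie(cookie: str, key: bytes) -> bool:
    user, _, sig = cookie.rpartition(".")
    expected = hmac.new(key, user.encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(sig, expected)

forged = mint_cookie("admin", SIGNING_KEY)   # the attacker forges a cookie...
print(verify_cookie(forged, SIGNING_KEY))    # ...and the service accepts it: True
```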
Earth

Netflix Targets Net-Zero Carbon Footprint by End of 2022 (variety.com) 44

Netflix says it has a plan to hit net zero greenhouse gas emissions by the end of 2022, with a big part of the streaming giant's efforts aimed at operating more eco-friendly film and TV productions. From a report: The "Net Zero + Nature" plan was outlined Tuesday in a blog post by Emma Stewart, PhD, who joined Netflix as its first sustainability officer last fall. At Netflix, "we aspire to entertain the world," she wrote. "But that requires a habitable world to entertain." In 2020, Netflix estimates its carbon footprint was 1.13 million metric tons, down slightly from 1.31 million the year prior (mostly due to delayed content productions during the COVID-19 pandemic). Roughly 50% of that was generated by the physical production of Netflix films and series, including third-party projects licensed as Netflix-branded originals. Another 45% came from corporate operations (e.g. office space) and purchased goods (like marketing spend) and 5% was attributed to internet cloud providers like Amazon Web Services and Netflix's Open Connect content delivery network.

Netflix's Net Zero + Nature approach encompasses three steps: reducing emissions, aligning with the Paris Agreement's goal to limit global warming to 1.5C; investing in projects that prevent carbon from entering the atmosphere; and investing in projects that remove carbon. (Netflix says its goal of reaching net zero CO2 emissions is a higher standard than "carbon neutral," which doesn't require reductions in greenhouse gas emissions.) By 2030, Netflix is aiming to reduce direct and indirect greenhouse gas emissions (Scope 1 and 2 emissions) by 45%, in line with the guidance from the Science Based Targets Initiative, a partnership among CDP, the U.N. Global Compact, World Resources Institute (WRI) and the World Wide Fund for Nature (WWF).
