Bug

OpenBSD Mail Server Bug Allowed Remotely Executing Shell Commands As Root (zdnet.com) 39

This week a remotely-exploitable vulnerability (granting root privileges) was discovered in OpenSMTPD (OpenBSD's implementation of server-side SMTP).

ZDNet notes that OpenSMTPD's "portable" version "has also been incorporated into other OSes, such as FreeBSD, NetBSD, and some Linux distros, such as Debian, Fedora, Alpine Linux, and more." To exploit this issue, an attacker must craft and send malformed SMTP messages to a vulnerable server... OpenSMTPD developers have confirmed the vulnerability and released a patch earlier Wednesday -- OpenSMTPD version 6.6.2p1...
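According to the Qualys advisory, the flaw let shell metacharacters in a sender address slip past validation and reach the local delivery command, which OpenSMTPD executes via the shell. The sketch below illustrates only the general command-injection class, not OpenSMTPD's actual C code:

```python
# Illustrative only: the bug class behind CVE-2020-7247, not OpenSMTPD's code.
# A ';' smuggled into a sender address is harmless as plain argv data, but
# becomes a command separator once a string containing it reaches "/bin/sh -c".
import subprocess

untrusted = "attacker;id"  # a hostile local-part, PoC-style

# Vulnerable shape: f"deliver -f {untrusted}" interpolated into a shell line.
# Safe shape: pass an argv list so no shell ever parses the input.
out = subprocess.run(["echo", untrusted], capture_output=True, text=True)
print(out.stdout.strip())  # the literal string "attacker;id"; `id` never runs
```

The upstream fix in 6.6.2p1 tightened the address validation itself rather than trying to quote hostile input downstream.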

The good news is that the bug was introduced into the OpenSMTPD code in May 2018, and many distros may still ship older versions that are not affected by this issue. For example, only in-development Debian releases are affected, not Debian stable branches, which ship with older OpenSMTPD versions.

Technical details and proof-of-concept exploit code are available in the Qualys CVE-2020-7247 security advisory.

Hackaday has a more detailed description of the vulnerability, while The Register looks at the buggy C code.

Interestingly, Qualys researchers exploited this vulnerability using a technique from the Morris Worm of 1988.
Encryption

Linus Torvalds Pulls WireGuard VPN into Linux 5.6 Kernel Source Tree (techradar.com) 51

"The WireGuard VPN protocol will be included into the next Linux kernel as Linus Torvalds has merged it into his source tree for version 5.6," reports TechRadar:
While there are many popular VPN protocols such as OpenVPN, WireGuard has made a name for itself by being as easy to configure and deploy as SSH... The WireGuard protocol is a project from security researcher and kernel developer Jason Donenfeld, who created it as an alternative to both IPsec and OpenVPN. Since the protocol consists of just around 4,000 lines of code, as opposed to the 100,000 lines of code that make up OpenVPN, it is much easier for security experts to review and audit for vulnerabilities.

While WireGuard was initially released for the Linux kernel, the protocol is now cross-platform and can be deployed on Windows, macOS, BSD, iOS and Android.

Ars Technica notes that with Linus having merged WireGuard into the source tree, "the likelihood that it will disappear between now and 5.6's final release (expected sometime in May or early June) is vanishingly small." WireGuard's Jason Donenfeld is also contributing AVX crypto optimizations to the kernel outside the WireGuard project itself. Specifically, Donenfeld has optimized the Poly1305 cipher to take advantage of instruction sets present in modern CPUs. Poly1305 is used for WireGuard's own message authentication but can be used outside the project as well — for example, chacha20-poly1305 is one of the highest-performing SSH ciphers, particularly on CPUs without AES-NI hardware acceleration.
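For reference, the Poly1305 construction itself is small enough to sketch in a few lines: it accumulates 16-byte message blocks in arithmetic modulo 2^130 - 5 using a clamped key half r, then adds the other half s. This is a from-scratch illustration of the RFC 8439 algorithm, not the optimized AVX code Donenfeld contributed:

```python
# A from-scratch Poly1305 sketch (RFC 8439 construction) -- illustrative only.
def poly1305_mac(msg: bytes, key: bytes) -> bytes:
    p = (1 << 130) - 5                                   # the prime in "1305"
    r = int.from_bytes(key[:16], "little") & 0x0ffffffc0ffffffc0ffffffc0fffffff
    s = int.from_bytes(key[16:], "little")
    acc = 0
    for i in range(0, len(msg), 16):                     # pad each block with a high 0x01 byte
        acc = (acc + int.from_bytes(msg[i:i + 16] + b"\x01", "little")) * r % p
    return ((acc + s) % (1 << 128)).to_bytes(16, "little")

key = bytes.fromhex("85d6be7857556d337f4452fe42d506a8"
                    "0103808afb0db2fd4abff6af4149f51b")
tag = poly1305_mac(b"Cryptographic Forum Research Group", key)
print(tag.hex())  # a8061dc1305136c6c22b8baf0c0127a9 (RFC 8439 test vector)
```

The kernel's AVX versions compute exactly this function; the speedups come from vectorizing the block loop, not from changing the math.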

Other interesting features new to the 5.6 kernel will include USB4 support, multipath TCP, AMD and Intel power management improvements, and more.

Windows

iPad Launch Blindsided Windows Team, Reveals Former Microsoft Executive (twitter.com) 109

The launch of the iPad ten years ago was a big surprise to everyone in the industry -- including Microsoft executives. Steven Sinofsky, the former President of the Windows Division at Microsoft, shares Microsoft's perspective, as well as those of other industry figures and the press, on the iPad: The announcement 10 years ago today of the "magical" iPad was clearly a milestone in computing. It was billed to be the "next" computer. For me, managing Windows, just weeks after the launch of Microsoft's "latest creation" Windows 7, it was as much a challenge as magical. Given that Star Trek had tablets, it was inevitable that the form factor would make it to computing (yes, the dynabook...). Microsoft had been working for more than 10 years starting with "WinPad" through Tablet PC. We were fixated on Win32, Pen, and more. The success of iPhone (140K apps & 3B downloads announced that day) blinded us at Microsoft as to where Apple was heading. Endless rumors of Apple's tablet *obviously* meant a pen computer based on Mac. Why not? The industry chased this for 20 years. That was our context. The press, however, was fixated on Apple lacking an "answer" (pundits seem to demand answers) to Netbooks -- those small, cheap, Windows laptops sweeping the world. Over 40 million sold. "What would Apple's response be?" We worried -- a cheap, pen-based, Mac. Sorry Harry!

Jobs said that a new computer needed to be better at some things, better than an iPhone/iPod and better than a laptop. Then he just went right at Netbooks answering what could be better at these things. "Some people have thought that that's a Netbook." (The audience joined in a round of laughter.) Then he said, "The problem is ... Netbooks aren't better at anything ... They're slow. They have low quality displays ... and they run clunky old PC software ... They're just cheap laptops." "Cheap laptops" ... from my perch that was a good thing. I mean inexpensive was a better word. But we knew that Netbooks (and ATOM) were really just a way to make use of the struggling efforts to make low-power, fanless, Intel chips for phones. A brutal takedown of 40M units. Sitting in a Le Corbusier chair, he showed the "extraordinary" things his new device did, from browsing to email to photos and videos and more. The real kicker was that it achieved 10 hours of battery life -- unachievable in PCs struggling for 4 hours with their whirring fans.

There was no stylus... no pen. How could one input or be PRODUCTIVE? PC brains were so wedded to a keyboard, mouse, and pen alternative that the idea of being productive without those seemed fanciful. Also instant standby, no viruses, rotate-able, maintained quality over time... As if to emphasize the point, Schiller showed "rewritten" versions of Apple's iWork apps for the iPad. The iPad would have a word processor, spreadsheet, and presentation graphics. Rounding out the demonstration, the iPad would also sync settings with iTunes -- content too. This was still early in the travails of iCloud but really a game changer Windows completely lacked except in enterprise with crazy server infrastructure or "consumer" Live apps. iPad had a 3G modem BECAUSE it was built on the iPhone. If you could figure out the device drivers and software for a PC, you'd need a multi-hundred dollar USB modem and a $60/month fee at best. The iPad made this a $29.99 option on AT&T and a slight uptick in purchase price. Starting at $499, iPad was a shot right across the consumer laptop. Consumer laptops were selling over 100 million units a year! Pundits were shocked at the price. I ordered mine arriving in 60/90 days.

At CES weeks earlier, there were the earliest tablets -- made with no help from Google; a few fringe Chinese ODMs were shopping hacky tablets called "Mobile Internet Devices" or "Media Tablets". Samsung's Galaxy was 9 months away. Android support (for 4:3 screens) was still a ways away. The first looks and reviews a bit later were just endless (and now tiresome) commentary on how the iPad was really for "consumption" and not productivity. There were no files. No keyboard. No mouse. No overlapping windows. Can't write code! In a literally classically defined case of disruption, iPad didn't do those things, but what it did, it did so much better that not only did people prefer it, they changed what they did in order to use it. Besides, email was the most used tool, and iPad was great for that. In its first year, 2010-2011, Apple sold 20 million iPads. That same year would turn out to be an historical high water mark for PCs (365M, ~180M laptops). Analysts who had forecasted more than 500M PCs were now rapidly increasing tablet forecasts to 100s of millions and dropping PC forecasts. The iPad and iPhone were soundly existential threats to Microsoft's core platform business.

Without a platform Microsoft controlled that developers sought out, the soul of the company was "missing." The PC had been overrun by browsers, a change 10 years in the making. PC OEMs were deeply concerned about a rise of Android and loved the Android model (no PC maker would ultimately be a major Android OEM, however). Even Windows Server was eclipsed by Linux and Open Source. The kicker for me, though, was that keyboard stand for the iPad. It was such a hack. Such an obvious "objection handler." But it was critically important because it was a clear reminder that the underlying operating system was "real" ...it was not a "phone OS". Knowing the iPhone and now iPad ran a robust OS under the hood, with a totally different "shell", interface model (touch), and app model (APIs and architecture) had massive implications for being the leading platform provider for computers. That was my Jan 27, 2010.
Further reading: The iPad's original software designer and program lead look back on the device's first 10 years.
Security

Intel Is Patching Its 'Zombieload' CPU Security Flaw For the Third Time (engadget.com) 24

An anonymous reader quotes a report from Engadget: For the third time in less than a year, Intel has disclosed a new set of vulnerabilities related to the speculative functionality of its processors. On Monday, the company said it will issue a software update "in the coming weeks" that will fix two more microarchitectural data sampling (MDS) or Zombieload flaws. This latest update comes after the company released two separate patches in May and November of last year.

Compared to the MDS flaws Intel addressed in those two previous patches, these latest ones have a couple of limitations. To start, one of the vulnerabilities, L1DES, doesn't work on Intel's more recent chips. Moreover, a hacker can't execute the attack using a web browser. Intel also says it's "not aware" of anyone taking advantage of the flaws outside of the lab.
In response to complaints of the company's piecemeal approach, Intel said that it has taken significant steps to reduce the danger the flaws represent to its processors.

"Since May 2019, starting with Microarchitectural Data Sampling (MDS), and then in November with TAA, we and our system software partners have released mitigations that have cumulatively and substantially reduced the overall attack surface for these types of issues," a spokesperson for the company said. "We continue to conduct research in this area -- internally, and in conjunction with the external research community."
Wireless Networking

Some Vendors Are Already Releasing Chipsets That Support 6 GHz Wifi (anandtech.com) 39

Long-time Slashdot reader gabebear writes: The FCC hasn't officially cleared 6 GHz for WiFi, but chipsets that support 6 GHz are starting to be released. 6 GHz opens up several times more bandwidth than what is currently available with WiFi, although it doesn't penetrate walls as well as 2.4 GHz.
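Some rough arithmetic shows why the new band matters. The figures below (a 1,200 MHz allocation spanning 5.925-7.125 GHz, versus roughly 83 MHz usable at 2.4 GHz and about 500 MHz at 5 GHz) are the commonly cited numbers for the FCC proposal, not from the submission itself:

```python
# Illustrative spectrum math for the proposed 6 GHz WiFi band (figures assumed).
new_spectrum_mhz = 7125 - 5925           # 1200 MHz of fresh spectrum
legacy_mhz = 83 + 500                    # rough usable total at 2.4 GHz + 5 GHz

channels_160 = new_spectrum_mhz // 160   # wide channels that fit without overlap
ratio = round(new_spectrum_mhz / legacy_mhz, 2)

print(channels_160)  # 7 non-overlapping 160 MHz channels
print(ratio)         # roughly 2x all currently available WiFi spectrum
```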

Celeno and Broadcom have each issued press releases. Still no news from Intel or Qualcomm on chipsets that support 6 GHz.

Businesses

Clayton Christensen, Father of 'Disruptive Innovation,' Dies At 67 (axios.com) 25

Clayton Christensen, the business scholar who coined the term "disruptive innovation," died of cancer treatment complications on Thursday at age 67. The Verge reports: You may not immediately recognize his name, but the tech industry -- and every resulting industry -- is built on the framework of technology disruption and innovation that Christensen devised. The crux of Christensen's theory is that big, successful companies that neglect potential customers at the lower end of their markets (mainframe computers, in his famous example) are ripe for disruption from smaller, more efficient, more nimble competitors that can do almost as good a job more cheaply (like personal computers). One need look no further than the biggest names in Silicon Valley to find evidence of successful disrupters, from Napster to Amazon to Uber to Airbnb and so on.

And scores of notable tech leaders have for years cited Christensen's 1997 book The Innovator's Dilemma as a major influence. It's the only business book on the late Steve Jobs' must-read list; Netflix CEO Reed Hastings read it with his executive team when he was developing the idea for his company; and the late Andy Grove, CEO of Intel, said the book and Christensen's theory were responsible for that company's turnaround. [...] He later refined his thinking on disruption, introducing the concept of "jobs to be done," which stressed the need to focus on customers' needs, and acknowledged that disruption was a great way to start a company, but not a good way to grow a company. "It's not a manual for how to grow or how to predict what customers want. [Jobs to be done] is the second side of the same coin: How can I be sure that competitors won't kill me and how can I be sure customers will want to buy the product? So it's actually a very important complement to disruption."

AMD

AMD Launches Navi-Based Radeon RX 5600XT To Battle GeForce RTX 2060 Under $300 (hothardware.com) 57

MojoKid writes: Today AMD launched its latest midrange graphics card based on the company's all-new Navi architecture. The AMD Radeon RX 5600 XT slots in under $300 ($279 MSRP) and is based on the same Navi 10 GPU as AMD's current high-end Radeon RX 5700 series cards. AMD's Radeon RX 5600 XT is outfitted with 36 compute units and a total of 2,304 stream processors; it is essentially a Radeon RX 5700-spec GPU with 2GB less GDDR6 memory (6GB total) and a narrower 192-bit interface, versus the Radeon RX 5700's 8GB, 256-bit config. HotHardware took a Sapphire Pulse Radeon RX 5600 XT around the benchmark track; this card has a BIOS switch on board that toggles between performance and silent/quiet modes. In performance mode, the card has a 160W power target, a 14Gbps memory data rate, a Boost Clock of 1,750MHz, and a Game Clock of 1,615MHz. In silent/quiet mode, things are a bit more tame, with a 135W power target, 12Gbps memory, and 1,620MHz/1,460MHz Boost and Game Clocks, respectively. In the gaming benchmarks, the new Radeon RX 5600 XT is generally faster than NVIDIA's GeForce RTX 2060 overall, with the exception of a few titles that are more NVIDIA-optimized and in VR. Though it lacks the capability for hardware-accelerated ray tracing, the new AMD Radeon RX 5600 XT weighs in at $20-30 less than NVIDIA's closest competitor and offers similar if not better performance.
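The memory figures quoted above translate directly into bandwidth: per-pin data rate times bus width. A quick sanity check of the two BIOS modes, using standard GDDR6 arithmetic (the derivation is ours, not the article's):

```python
# Peak memory bandwidth = per-pin data rate (Gbps) * bus width (bits) / 8 bits-per-byte
def bandwidth_gbs(data_rate_gbps: float, bus_width_bits: int) -> float:
    return data_rate_gbps * bus_width_bits / 8

print(bandwidth_gbs(14, 192))  # performance mode: 336.0 GB/s
print(bandwidth_gbs(12, 192))  # silent/quiet mode: 288.0 GB/s
print(bandwidth_gbs(14, 256))  # the Radeon RX 5700's wider 256-bit bus: 448.0 GB/s
```

The narrower bus, not the compute-unit count, is the main thing separating the 5600 XT from its 5700-series siblings.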
Open Source

Tuxedo's New Manjaro Linux Laptops Will Include Massive Customization (forbes.com) 17

Tuxedo Computers "has teamed up with Manjaro to tease not one, not two, but several" Linux laptops, Forbes reports:
The Tuxedo Computers InfinityBook Pro 15...can be loaded with up to 64GB of RAM, a 10th-generation Intel Core i7 CPU, and as high as a 2TB Samsung EVO Plus NVMe drive. You can also purchase up to a 5-year warranty, and user-installed upgrades will not void the warranty...

Manjaro Lead Project Developer Philip Müller also teased a forthcoming AMD Ryzen laptop [on Forbes' "Linux For Everyone" podcast]. "Yes, we are currently evaluating which models we want to use because the industry is screaming for that," Müller says. "In the upcoming weeks we might get some of those for internal testing. Once they're certified and the drivers are ready, we'll see when we can launch those." Müller also tells me they're prepping what he describes as a "Dell XPS 13 killer."

"It's 10th-generation Intel based, we will have it in 14-inch with a 180-degree lid, so you can lay it flat on your desk if you like," he says.

The Manjaro/Tuxedo Computers partnership will also offer some intense customization options, Forbes adds.

"Want your company logo laser-etched on the lid? OK. Want to swap out the Manjaro logo with your logo on the Super key? Sure, no problem. Want to show off your knowledge of fictional alien races? Why not get a 100% Klingon keyboard?"
Desktops (Apple)

Low Power Mode for Mac Laptops: Making the Case Again (marco.org) 58

In light of this week's rumor that a Pro Mode -- which will supposedly boost performance on Macs running the Catalina operating system -- may be coming, longtime developer and Apple commentator Marco Arment makes the case for a Low Power Mode on macOS. He writes: Modern hardware constantly pushes thermal and power limits, trying to strike a balance that minimizes noise and heat while maximizing performance and battery life. Software also plays a role, trying to keep everything background-updated, content-indexed, and photo-analyzed so it's ready for us when we want it, but not so aggressively that we notice any cost to performance or battery life. Apple's customers don't usually have control over these balances, and they're usually fixed at design time with little opportunity to adapt to changing circumstances or customer priorities.

The sole exception, Low Power Mode on iOS, seems to be a huge hit: by offering a single toggle that chooses a different balance, people are able to greatly extend their battery life when they know they'll need it. Mac laptops need Low Power Mode, too. I believe so strongly in its potential because I've been using it on my laptops (in a way) for years, and it's fantastic. I've been disabling Intel Turbo Boost on my laptops with Turbo Boost Switcher Pro most of the time since 2015. In 2018, I first argued for Low Power Mode on macOS with a list of possible tweaks, concluding that disabling Turbo Boost was still the best bang-for-the-buck tweak to improve battery life without a noticeable performance cost in most tasks.

Recently, as Intel has crammed more cores and higher clocks into smaller form factors and pushed thermal limits to new extremes, the gains have become even more significant. [...] With Turbo Boost disabled, peak CPU power consumption drops by 62%, with a correspondingly huge reduction in temperature. This has two massive benefits: The fans never audibly spin up. [...] It runs significantly cooler. Turbo Boost lets laptops get too hot to comfortably hold in your lap, and so much heat radiates out that it can make hands sweaty. Disable it, and the laptop only gets moderately warm, not hot, and hands stay comfortably dry. I haven't done formal battery testing on the 16-inch, since it's so difficult and time-consuming to do in a controlled way that's actually useful to people, but anecdotally, I'm seeing similar battery gains by disabling Turbo Boost that I've seen with previous laptops: significantly longer battery life that I'd estimate to be between 30-50%.

Programming

'We're Approaching the Limits of Computer Power -- We Need New Programmers Now' (theguardian.com) 306

Ever-faster processors led to bloated software, but physical limits may force a return to the concise code of the past. John Naughton: Moore's law is just a statement of an empirical correlation observed over a particular period in history and we are reaching the limits of its application. In 2010, Moore himself predicted that the laws of physics would call a halt to the exponential increases. "In terms of size of transistor," he said, "you can see that we're approaching the size of atoms, which is a fundamental barrier, but it'll be two or three generations before we get that far -- but that's as far out as we've ever been able to see. We have another 10 to 20 years before we reach a fundamental limit." We've now reached 2020 and so the certainty that we will always have sufficiently powerful computing hardware for our expanding needs is beginning to look complacent. Since this has been obvious for decades to those in the business, there's been lots of research into ingenious ways of packing more computing power into machines, for example using multi-core architectures in which a CPU has two or more separate processing units called "cores" -- in the hope of postponing the awful day when the silicon chip finally runs out of road. (The new Apple Mac Pro, for example, is powered by a 28-core Intel Xeon processor.) And of course there is also a good deal of frenzied research into quantum computing, which could, in principle, be an epochal development.
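Multi-core hardware only helps software written to exploit it, which is part of the article's point about programming craft. A minimal sketch of spreading an embarrassingly parallel job across cores (the prime-counting task is an arbitrary stand-in for any divisible workload):

```python
# Sketch: splitting a CPU-bound job across cores with the standard library.
from concurrent.futures import ProcessPoolExecutor
import math

def count_primes(rng):
    """Count primes in [lo, hi) by trial division -- deliberately CPU-bound."""
    lo, hi = rng
    return sum(1 for n in range(lo, hi)
               if n > 1 and all(n % d for d in range(2, math.isqrt(n) + 1)))

if __name__ == "__main__":
    # Four independent chunks; the pool defaults to one worker per core.
    chunks = [(i, i + 25_000) for i in range(0, 100_000, 25_000)]
    with ProcessPoolExecutor() as pool:
        total = sum(pool.map(count_primes, chunks))
    print(total)  # 9592 primes below 100,000, same answer as the serial version
```

The catch, of course, is that unlike clock-speed gains, this speedup is not free: someone has to partition the work, and not every program divides this cleanly.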

But computing involves a combination of hardware and software and one of the predictable consequences of Moore's law is that it made programmers lazier. Writing software is a craft and some people are better at it than others. They write code that is more elegant and, more importantly, leaner, so that it executes faster. In the early days, when the hardware was relatively primitive, craftsmanship really mattered. When Bill Gates was a lad, for example, he wrote a Basic interpreter for one of the earliest microcomputers, the TRS-80. Because the machine had only a tiny read-only memory, Gates had to fit it into just 16 kilobytes. He wrote it in assembly language to increase efficiency and save space; there's a legend that for years afterwards he could recite the entire program by heart. There are thousands of stories like this from the early days of computing. But as Moore's law took hold, the need to write lean, parsimonious code gradually disappeared and incentives changed.

Intel

Intel's First Discrete GPU is Built For Developers (engadget.com) 50

At its CES 2020 keynote, Intel showed off its upcoming Xe discrete graphics chip and today, we're seeing exactly how that's going to be implemented. From a report: First off, Intel unveiled a standalone DG1 "software development vehicle" card that will allow developers to optimize apps for the new graphics system. It didn't reveal any performance details for the card, but did show it running the Warframe game. It also noted that it's now "sampling to ISVs (independent software vendors) worldwide... enabling developers to optimize for Xe." As far as we know right now, Intel's discrete graphics will be chips (not cards) installed together with the CPUs on a single package. However, it's interesting to see Intel graphics in the form of a standalone PCIe card, even one that will never be sold to consumers.
Intel

Thunderbolt 4 Arrives In 2020, But USB Will Remain the King of PC Ports (cnet.com) 161

Intel announced Thunderbolt 4 this week at CES, saying it will arrive in PCs later this year with Intel's new Tiger Lake processor. But, as CNET reports, "the all-purpose port won't be any faster at transferring data than the 4-year-old Thunderbolt 3." From the report: The chipmaker promised it would be four times faster than today's USB, then clarified it was talking about the USB 3.1 version at 10 gigabits per second. Thunderbolt 3, though, already can transfer data at 40Gbps. Still, you can expect other changes. "It standardizes PC platform requirements and adds the latest Thunderbolt innovations," Intel spokeswoman Sarah Kane said in a statement, adding that Intel plans to share more about Thunderbolt 4 later.
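The practical difference between those line rates is easy to put in file-transfer terms. The sketch below uses the ideal throughput only; real-world transfers are slower due to protocol overhead:

```python
# Ideal transfer time: size in gigabytes * 8 bits-per-byte / line rate in Gbps.
def seconds_to_copy(gigabytes: float, gbps: float) -> float:
    return gigabytes * 8 / gbps

print(seconds_to_copy(50, 40))  # Thunderbolt 3/4 at 40 Gbps: 10.0 s for a 50 GB file
print(seconds_to_copy(50, 10))  # USB 3.1 at 10 Gbps: 40.0 s for the same file
```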

Thunderbolt, embraced first by Apple in 2011 and later by some Windows PC makers, has proved popular in high-end computing situations demanding a multipurpose connector. A single Thunderbolt port can link to external monitors, network adapters, storage systems and more. But Intel's years-long ambition to make Thunderbolt mainstream hasn't succeeded. Instead, USB remains the workhorse port.

AMD

AMD Unveils Ryzen 4000 Mobile CPUs Claiming Big Gains, 64-Core Threadripper (hothardware.com) 71

MojoKid writes: Yesterday, AMD launched its new Ryzen 4000 Series mobile processors for laptops at CES 2020, along with a monstrous 64-core/128-thread third-generation Ryzen Threadripper workstation desktop CPU. In addition to the new processors, the oft-leaked Radeon RX 5600 XT, which targets 1080p gamers in the sweet spot of the GPU market, was also made official. In CPU news, AMD claims Ryzen 4000 series mobile processors offer 20% lower SOC power, 2X perf-per-watt, 5X faster power state switching, and significantly improved iGPU performance versus its previous-gen mobile Ryzen 3000 products. AMD's U-Series flagship, the Ryzen 7 4800U, is an 8-core/16-thread processor with a max turbo frequency of 4.2GHz and an integrated Vega-derived 8-core GPU.

Along with architectural enhancements and the frequency benefits of producing the chips at 7nm, AMD is underscoring up to 59% improved performance per graphics core as well. AMD is also claiming superior single-thread CPU performance versus current Intel processors and significantly better multi-threaded performance. The initial Ryzen 4000 U-Series line-up consists of five processors, starting with the 4-core/4-thread Ryzen 5 4300U, and topping off with the aforementioned Ryzen 7 4800U. On the other end of the spectrum, AMD revealed some new information regarding its 64-core/128-thread Ryzen Threadripper 3990X processor. The beast chip will have a base clock of 2.9GHz and a boost clock of 4.3GHz with a whopping 288MB of cache. The chip will drop into existing TRX40 motherboards and be available on February 7th for $3990. AMD showcased the chip versus a dual-socket Intel Xeon Platinum in the VRAY 3D rendering benchmark, beating the Xeon system by almost 30 minutes in a 90-minute workload, though the Intel system retails for around $20K.

Microsoft

The Original Xbox Was Announced 19 Years Ago Today (gamerevolution.com) 51

On January 6, 2001, Bill Gates and The Rock debuted the original Xbox, calling it "the most electrifying" games console on the market. GameRevolution reports: The surreal image of The Rock standing alongside Gates, telling the billionaire "it doesn't matter what you think, Bill," was certainly a unique way to debut the console. We're glad Microsoft opted for this unusual route, though, because if it hadn't we wouldn't have video footage of The Rock discussing symmetric multiprocessing.

The original Xbox was released on November 15, 2001. [It debuted with a 32-bit 733 MHz, custom Intel Pentium III Coppermine-based processor, 133 MHz 64-bit GTL+ front-side bus (FSB) with a 1.06 GB/s bandwidth, and 64 MB unified DDR SDRAM, with a 6.4 GB/s bandwidth, of which 1.06 GB/s is used by the CPU and 5.34 GB/s is shared by the rest of the system, according to Wikipedia.] Its high manufacturing cost would wind up costing Microsoft a lot of money, with the company losing $4 billion on the console. It would also fall short of its predicted 50 million sales, with Microsoft only shifting 24 million units by the end of its life cycle.
For comparison, the Xbox One X, which debuted on November 7, 2017, featured a SoC which incorporates a 2.3 GHz octa-core CPU, and Radeon GPU with 40 Compute Units clocked at 1172 MHz, generating 6 teraflops of graphical computing performance. It also includes 12GB of GDDR5 RAM with 9 GB allocated to games. Microsoft's next-generation Series X console is expected to deliver "four times the processing power of Xbox One X," although technical specs have yet to be announced.
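The 6-teraflop figure for the Xbox One X follows directly from the quoted GPU specs, assuming AMD's usual 64 shaders per compute unit and two FLOPs per shader per cycle (a fused multiply-add) -- those two constants are our assumptions, not from the article:

```python
# TFLOPS = CUs * shaders-per-CU * FLOPs-per-cycle * clock (Hz), in units of 1e12
cus, shaders_per_cu, flops_per_cycle, clock_hz = 40, 64, 2, 1172e6
tflops = cus * shaders_per_cu * flops_per_cycle * clock_hz / 1e12
print(round(tflops, 2))  # ~6.0, matching the quoted graphical compute figure
```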
Hardware

The Samsung Galaxy Chromebook is Beautiful, Fast, and Expensive (theverge.com) 31

An anonymous reader shares a report: The Samsung Galaxy Chromebook is one of the nicest pieces of laptop hardware I've touched in a very long time. Not since Google's 2017 Pixelbook has there been a ChromeOS device this good looking, this powerful, or -- here's the rub -- this expensive. Available sometime in the first quarter, the Galaxy Chromebook starts at $999 and could go much higher if you fully upgrade its RAM and storage. The central conceit of this laptop is that there really is demand for a high-end Chromebook, and while that may be more true in 2020 than it was in 2017, it's not a sure thing. Chrome OS still has a nagging inability to do some of the things you'd want a device that costs more than a thousand dollars to do: run full desktop apps, easily edit photos and video, or play more premium games.

Despite those limitations, Google and Samsung are looking for ways to get Chromebooks to escape the classroom and start appearing in boardrooms. The Galaxy Chromebook could be part of a revitalized effort to do just that. Running down the specs of the Galaxy Chromebook is like hitting a laundry list of the things you might want in a top-tier Windows ultrabook. It has a 13.3-inch 4K AMOLED display and an Intel 10th-gen Core i5 processor. There's a fingerprint sensor for unlocking, two USB-C ports, and expandable storage via microSD. The screen rotates 360 degrees and there's an included S-Pen stylus that can be stored in a silo on the device itself. It's built out of aluminum instead of plastic, has a large trackpad, and is less than 10mm thick.

United States

Is Big Tech Turning New York City Into America's Second Silicon Valley? (nytimes.com) 144

When Facebook decided it wanted to move 6,000 workers into Hudson Yards in New York, "existing tenants were told to move," reports the New York Times, as "part of a rush by the West Coast technology giants to expand in New York City." The growth in New York is occurring largely without major economic incentives from the city and state governments. Officials are mindful of the outcry last year over at least $3 billion in public subsidies that Amazon was offered to build a corporate campus in Queens... Amazon's announcement last month that it would lease space in Midtown for 1,500 workers renewed a debate over whether incentives should be used to woo huge tech companies to New York...

At Google's New York office, highly skilled workers now outnumber their colleagues in sales and marketing. Of the nearly 800 job openings that Amazon has in the city, more than half are for developers, engineers and data scientists. "Every line of business and every platform is represented quite healthfully," said William Floyd, Google's head of external affairs in New York, the company's largest office except for its Mountain View, Calif., headquarters. "Not everyone wants to be in California." Oren Michels, a tech adviser and investor who sold Mashery, a company based in San Francisco, to Intel in 2013, said that New York City had become a refuge for tech workers who did not want to be surrounded solely by those working in the same industry. "You have younger engineers and those sorts of people who frankly want to live in New York City because it's a more interesting and fun place to live," he said. "San Francisco is turning into a company town and the company is tech, both professionally and personally...."

Since 2016, the number of job openings in the city's tech sector has jumped 38 percent, an analysis for The Times by the jobs website Glassdoor found. In November, New York had the third-highest number of tech openings among United States cities, 26,843, behind just San Francisco and Seattle. It is not only the biggest tech firms that are growing in New York. From 2018 through the third quarter of 2019, investors pumped more than $27 billion into start-ups in the New York City region, the second most in that time for any area outside San Francisco, according to the MoneyTree Report by PwC-CB Insights. (Nearly $100 billion was invested in start-ups in the Silicon Valley area in that period....)

The major tech firms are expected to grow to the point that they are among the largest private tenants in New York in the coming years, rivaling longtime leaders like JPMorgan Chase.

Open Source

Linux Kernel Developers and Commits Dropped in 2019 (phoronix.com) 37

Phoronix reports that on New Year's Day, the Linux kernel's Git source tree showed 27,852,148 lines of code, divided among 66,492 files (including docs, Kconfig files, user-space utilities in-tree, etc).

Over its lifetime there have been 887,925 commits from around 21,074 different authors: During 2019, the Linux kernel saw 74,754 commits, which is actually the lowest point since 2013. The 74k commits compares to the 80k commits seen in both 2017 and 2018, 77k commits in 2016, and 75k commits in both 2014 and 2015. Besides the commit count being lower, the author count for the year is also lower: 2019 saw around 4,189 different authors to the Linux kernel, which is lower than the 4,362 in 2018 and 4,402 in 2017.

While the commit count is lower for the year, on a line count it's about average with 3,386,347 lines of new code added and 1,696,620 lines removed...
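Figures like these can be approximated against a local kernel checkout with plain git. A minimal sketch follows; the helper names (`yearly_stats`, `count_commits_and_authors`) are illustrative, not anything Phoronix publishes, and exact totals will differ slightly from theirs depending on how duplicate author identities are merged:

```python
# Sketch: reproduce yearly commit/author counts for a git repository,
# e.g. a local clone of the Linux kernel tree.
import subprocess
from collections import Counter

def count_commits_and_authors(log_lines):
    """Given `git log --pretty=%aE` output (one author email per commit),
    return (total_commits, distinct_authors)."""
    authors = Counter(line.strip() for line in log_lines if line.strip())
    return sum(authors.values()), len(authors)

def yearly_stats(repo_path, year):
    """Count commits authored in `year` in the repo at `repo_path`."""
    out = subprocess.run(
        ["git", "-C", repo_path, "log",
         f"--since={year}-01-01", f"--until={year}-12-31",
         "--pretty=%aE"],
        capture_output=True, text=True, check=True,
    ).stdout
    return count_commits_and_authors(out.splitlines())
```

Line-added/removed totals could be gathered the same way from `git log --numstat` output, summed per year.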

Intel and Red Hat have remained the top companies contributing to the upstream Linux kernel.

Cellphones

Superphones, Hyperloops, and Other Tech Predictions That Haven't Happened (Yet) (tulsaworld.com) 39

Bloomberg looks back at what tech industry titans predicted would be happening "by 2020."

- Here's what Huawei Technologies Co. said in 2015 predicting a "superphone" by 2020, according to ZDNet: "Inspired by the biological evolution, the mobile phone we currently know will come to life as the superphone," said Shao Yang, a strategy marketing president of Huawei. "Through evolution and adaptation, the superphone will be more intelligent, enhancing and even transforming our perceptions, enabling humans to go further than ever before." It's not entirely clear what that means, but it probably hasn't happened yet. In the interim, Huawei found itself in the middle of a trade war, and the Chinese company is focusing largely on mid-priced phones for its domestic market...

- In 2013, Elon Musk outlined his vision for a new "fifth mode of transportation" that would involve zipping people through tubes at speeds as fast as 800 miles per hour. Several tech entrepreneurs heeded Musk's call and went to work on such systems inspired by the billionaire's specifications. In 2015, one of the leading startups predicted a hyperloop spanning about 60 miles would be ready for human transport by 2020. Rob Lloyd, then the CEO of Hyperloop Technologies, told Popular Science: "I'm very confident that's going to happen." It hasn't. His company, now called Virgin Hyperloop One, has a 1,600-foot test track in California and hopes to build a 22-mile track in Saudi Arabia someday. Musk has since experimented with hyperloops of his own, and even he has had to scale back his ambitions. Musk's Boring Co. is building a so-called Loop system in Las Vegas, starting with a nearly mile-long track that consists of a narrow tunnel and Tesla cars moving at up to 155 miles per hour...

- It was barely two years ago when the maker of blowdryers and vacuum cleaners said it would sell an electric car by 2020. Dyson canceled the project this year, calling it "not commercially viable."

Other predictions include John McAfee's infamous 2017 prediction that one bitcoin would be worth $1 million by the end of 2020, "about three weeks before a crash would erase 83% of value over the next year."

And in 2012 Intel predicted that by 2020 we'd have computer chips that consumed almost no energy.

Privacy

Ask Slashdot: What Will the 2020s Bring Us? 207

dryriver writes: The 2010s were not necessarily the greatest decade to live through. AAA computer games were not only DRM'd and internet-tethered to death but became increasingly formulaic and pay-to-win driven, and poor-quality console ports pissed off PC gamers. Forced software subscriptions for major software products you could previously buy became a thing. Personal privacy went out the window in ways too numerous to list, with lawmakers failing on many levels to regulate the tech, data-mining and internet advertising companies in any meaningful way. Severe security vulnerabilities were found in hundreds of different tech products, from Intel CPUs to baby monitors and internet-connected doorbells. Thousands of tech products shipped with microphones, cameras, and internet connectivity that couldn't be switched off with an actual hardware switch. Many electronics products became harder or impossible to repair yourself. Printed manuals coming with tech products became almost non-existent. Hackers, scammers, ransomwarers and identity thieves caused more mayhem than ever before. Troll farms, click farms and fake news factories damaged the integrity of the internet as an information source. Tech companies and media companies became afraid of pissing off the Chinese government.

Windows turned into a big piece of spyware. Intel couldn't be bothered to innovate until AMD Ryzen came along. Nvidia somehow took a full decade to make really basic realtime raytracing happen, even though smaller GPU maker Imagination had done it years earlier with a fraction of the budget, and in a mobile GPU to boot. Top-of-the-line smartphones became seriously expensive. Censorship and shadow banning on the once-more-open internet became a thing. Easily-triggered people trying to muzzle other people on social media became a thing. The quality of popular music and music videos went steadily downhill. Star Wars went to shit after Disney bought it, as did the Star Trek films. And mainstream cinema turned into an endless fest of VFX-heavy comic-book movies, remakes/reboots, and horror movies. In many ways, television was the biggest winner of the 2010s, with many new TV shows with film-like production values being made. The second winner may be computer hardware that delivered more storage/memory/performance per dollar than ever before.

To the question: What, dear Slashdotters, will the 2020s bring us? Will things get better in tech and other things relevant to nerds, or will they get worse?
Businesses

Uber Joins Forces With Joby Aviation To Launch An Air Taxi Service By 2023 (theverge.com) 22

Uber is joining forces with California-based aerospace company Joby Aviation to launch urban air-taxi services in select locations by 2023. The Verge reports: Joby is the brainchild of inventor JoeBen Bevirt, who started the company in 2009. The company operated in relative obscurity until 2018, when Joby announced it had raised a surprising $100 million from a variety of investors, including the venture capital arms of Intel, Toyota, and JetBlue. The money helped finance development of the company's air taxi prototype, which has been conducting test flights at Joby's private airfield in Northern California.

Unlike the dozens of other companies that are currently building electric vertical take-off and landing (eVTOL) aircraft, Joby has kept much of its project under wraps. The few renderings that are out there show a plane-drone hybrid with 12 rotors and room in the cabin for four passengers, though a spokesperson previously cautioned that what Joby is working on now is "entirely new." The company has yet to provide any recent photographs or images of its prototype aircraft. [...] Uber says that it has signed a multiyear commercial contract with Joby to "launch a fast, reliable, clean and affordable urban air taxi service in select markets." Neither company disclosed the terms of the deal, nor would they comment on whether there was any money exchanged.
The report notes that Joby "will supply and operate the electric air taxis, and Uber will provide air traffic control help, landing pad construction, connections to ground transportation, and, of course, its ride-share network reconfigured to allow customers to hail flying cars rather than regular, terrestrial ones."
