The layoffs will continue until (investor) morale improves
Since the start of 2023, more than 150,000 people have been laid off at tech companies, large and small. That’s a staggering number of people who have been put out of work.
When you think about how Meta, Amazon and Salesforce have handled these layoffs, the situation becomes even more grim.
Salesforce announced in January that it was laying off 10% of its approximately 80,000-employee workforce. Since then, it has been letting people go in dribs and drabs. Amazon also announced in January that it was laying off 18,000 employees, then announced another 9,000 this week. Meta laid off 11,000 in November and let another 10,000 people go in a second round this week. In addition, the company shut down another 5,000 open reqs.
This rolling approach to layoffs, which some would call cruel, leaves employees anxious and uncertain about their own positions, even as they grieve the loss of valued colleagues who have been let go.
Investors, on the other hand, seem to like layoffs as a way to move companies toward greater operating efficiency. CEOs typically are less concerned about the well-being of their employees than they are about keeping investors happy.
An argument could be made, of course, that these companies overhired during the recent tech boom, and now it’s time to right size to better fit a changing market. That argument would carry more weight if the companies in question weren’t profitable. However, large American tech companies are very often both profitable and incredibly wealthy, even if their market cap has fallen from record highs.
While there is some truth to the idea that companies grew too quickly in recent years and need to reset, layoffs feel like the worst kind of short-term thinking: sacrificing employees to please investors. Are companies at least getting what they want from investors out of this devil’s bargain?
Investor response
If companies are looking to impress investors with their cost-cutting measures, we can rate how effective their layoffs are by how investors have reacted to them.
ChatGPT started a new kind of AI race — and made text boxes cool again
Who would have thought that typing into a chat window, on your computer, would be 2023’s hottest innovation? Tired: the metaverse. Wired: the message-verse?

It’s pretty obvious that nobody saw ChatGPT coming. Not even OpenAI. Before it became by some measures the fastest growing consumer app in history, before it turned the phrase “generative pre-trained transformers” into common vernacular, before every company you can think of was racing to adopt its underlying model, ChatGPT launched in November as a “research preview.” The blog post announcing ChatGPT is now a hilarious case study in underselling. “ChatGPT is a sibling model to InstructGPT, which is trained to follow an instruction in a prompt and provide a detailed response. We are excited to introduce ChatGPT to get users’ feedback and learn about its strengths and weaknesses.” That’s it! That’s the whole pitch! No waxing poetic about fundamentally changing the nature of our interactions with technology, not even, like, a line about how cool it is. It was just a research preview.

But now, barely four months later, it looks like ChatGPT really is going to change the way we think about technology. Or, maybe more accurately, change it back. Because the way we’re going, the future of technology is not whiz-bang interfaces or the metaverse. It’s “typing commands into a text box on your computer.” The command line is back — it’s just a whole lot smarter now.

Really, generative AI is headed in two simultaneous directions. The first is much more infrastructural, adding new tools and capabilities to the stuff you already use.
Large language models like GPT-4 and Google’s LaMDA are going to help you write emails and memos; they’re going to automatically spruce up your slide decks and correct any mistakes in your spreadsheets; they’re going to edit your photos better than you can; they’re going to help you write code and in many cases just do it for you.

Remember when everybody, even Pizza Hut, was doing chatbots?

This is roughly the path AI has been on for years, right? Google has been integrating all kinds of AI into its products over the last few years, and even companies like Salesforce have built strong AI research projects. These models are expensive to create, expensive to train, expensive to query, and potentially game-changing for corporate productivity. AI enhancement in products you already use is a big business — or, at least, is being invested in like one — and will be for a long time.

The other AI direction, the one where interacting with the AI becomes a consumer product, was a much less obvious development. It makes sense now, of course: who doesn’t want to talk to a robot that knows all about movies and recipes and what to do in Tokyo, and that, if you say just the right things, might go totally off the rails and try to make out with you? But before ChatGPT took the world by storm, and before Bing and Bard both took the idea and tried to build their own products out of it, I certainly wouldn’t have bet that typing into a chat window would be the next big thing in user interfaces.

In a way, this is a return to a very old idea. For many years, most users only interacted with computers by typing on a blank screen — the command line was how you told the machine what to do. (Yes, ChatGPT is a lot of machines, and they’re not right there on your desk, but you get the idea.)

But then, a funny thing happened: we invented better interfaces! The trouble with the command line was that you needed to know exactly what to type and in which order to get the computer to behave.
Pointing and clicking on big icons was much simpler, plus it was much easier to teach people what the computer could do through pictures and icons. The command line gave way to the graphical user interface, and the GUI still reigns supreme.

Developers never stopped trying to make chat UI work, though. WhatsApp is a good example: the company has spent years trying to figure out how users can use chat to interact with businesses. Allo, one of Google’s many failed messaging apps, hoped you might interact with an AI assistant inside chats with your friends. The first round of chatbot hype, circa 2016, had a lot of very smart people thinking that messaging apps were the future of everything.

There’s just something alluring about the messaging interface, the “conversational AI.” It starts with the fact that we all know how to use it; messaging apps are how we keep in touch with the people we care about most, which means they’re a place we spend a lot of time and energy. You may not know how to navigate the recesses of the Uber app or how to find your frequent flier number in the Southwest app, but “text these words to this number” is a behavior almost anyone understands. In a market where people don’t want to download apps and mobile websites mostly still suck, messaging can simplify experiences in a big way.

Bing (and everybody else) is taking the chat interface and running with it.

Also, while messaging isn’t the most advanced interface, it might be the most expandable. Take Slack, for instance: you probably think of it as a chat app, but in that back-and-forth interface, you can embed links, editable documents, interactive polls, informational bots, and so much more. WeChat is famously an entire platform — basically an entire internet — smushed into a messaging app. You can start with messaging and go a lot of places.

But so many of these tools stumble in the same ways.
For quick exchanges of information, like business hours, chat is perfect — ask a question, get an answer. But browsing a catalog as a series of messages? No thanks. Buying a plane ticket with a thousand-message back-and-forth? Hard pass. It’s no different than voice assistants, and god help you if you’ve ever tried to buy even simple things with Alexa. (“For Charmin, say ‘three.’”) For most complicated things, a visual and dedicated UI is far better than a messaging window.

And when it comes to ChatGPT, Bard, Bing, and the rest, things get complicated really fast. These models are smart and collaborative, but you still have to know exactly what to ask for, in what way, and in what order to get what you want. The idea of a “prompt engineer,” the person you pay to know exactly how to coax the perfect image from Stable Diffusion or get ChatGPT to generate just the right JavaScript, seems ridiculous but is actually an utterly necessary part of the equation. It’s no different than in the early computer era, when only a few people knew how to tell the computer what to do. There are already marketplaces where you can buy and sell really great prompts; there are prompt gurus and books about prompts; I assume Stanford is already working on a Prompt Engineering major that everyone will be taking soon.

The remarkable thing about generative AI is that it feels like it can do almost anything. That’s also the whole problem. When you can do anything, what do you do? Where do you start? How do you learn how to use it when your only window into its possibilities is a blinking cursor? Eventually, these companies might develop more visual, more interactive tools that help people truly understand what they can do and how it all works. (This is one reason to keep an eye on ChatGPT’s new plug-ins system, which is pretty straightforward for now but could quickly expand the things you can do in the chat window.)
Right now, the best idea any of them have is to offer a few suggestions about things you might type. AI was going to be a feature. Now it’s the product. And that means the text box is back. Messaging is the interface, again.
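Under the hood, the "text box as interface" idea boils down to a simple data structure: a running list of messages that gets flattened into a single prompt each turn. A minimal sketch of that loop, with `generate_reply` as a stand-in for whatever model you would actually call (the function names here are illustrative, not any vendor's real API):

```python
# Minimal chat-loop sketch: the chat "text box" UI is just a growing
# message history flattened into one prompt per turn.

def format_prompt(history):
    """Flatten (role, text) pairs into a single prompt string."""
    lines = [f"{role}: {text}" for role, text in history]
    lines.append("assistant:")  # cue the model to respond next
    return "\n".join(lines)

def chat_turn(history, user_text, generate_reply):
    """Record the user's message, query the model, record its reply."""
    history.append(("user", user_text))
    reply = generate_reply(format_prompt(history))
    history.append(("assistant", reply))
    return reply

if __name__ == "__main__":
    history = [("system", "You are a helpful assistant.")]
    # Stub "model" that just echoes the last user message.
    echo = lambda prompt: prompt.splitlines()[-2].removeprefix("user: ")
    print(chat_turn(history, "Hello there", echo))
```

The point of the sketch is that the interface carries no state beyond the message list itself, which is exactly why a plain text box is enough to drive it.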
Pinterest brings shopping capabilities to Shuffles, its collage-making app
Pinterest announced today that it’s testing ways to integrate Shuffles collage content into Pinterest, starting with shopping. Shuffles, which is Pinterest’s collage-making app, launched to the general public last November. To use Shuffles, users build collages using Pinterest’s own photo library or by snapping photos of objects they want to include with their iPhone’s camera. The iOS-only app is available in the U.S., Canada, Great Britain, Ireland, Australia and New Zealand.
Shuffles collages will now have the same shopping capabilities as regular Pins. Users will be able to tap individual cutouts used in collages and see the brand, price and other product metadata, along with similar products to shop.
“Unlike typical product exploration, Shuffles bring an interactivity that makes the experience inspirational and fun,” the company said in a blog post. “Gen-Z is curating fresh, relevant content alongside their peers, which is quickly making for a marketplace of trendy, shoppable ideas. The high density nature of Shuffles, which can include layers of product cutouts from multiple Pins, allows consumers to dig deeper and also connect to other Shuffles that include the same Pins. As we look ahead to how consumer behavior is evolving, we’re testing ways of integrating Shuffles collage content into Pinterest, starting with shopping.”
Although Shuffles surged to become the No. 1 Lifestyle app on the U.S. App Store in August when it was invite-only, the app’s popularity has since declined. By bringing shopping capabilities to Shuffles, Pinterest is likely looking for ways to retain users on the standalone app.
Image Credits: Pinterest
Pinterest also announced that it’s exploring a new takeover feature for advertisers called “Pinterest Premiere Spotlight” that prominently showcases a brand on search. The company says the feature is designed to give advertisers a new way to reach users on Pinterest.
The company says 97% of top searches on Pinterest are unbranded, which means users typically don’t type a brand name into their searches on the platform. This gives brands the opportunity to be discovered as they help consumers go from discovery to decision to purchase, Pinterest says. In the coming months, the company plans to offer additional ways to help brands connect with shoppers.
Pinterest also shared some new stats about its Catalogs offering, which lets brands upload their full catalog to the platform and turn their products into dynamic Product Pins. The company says it has seen a 66% increase in retailers setting up shop by uploading or integrating their digital catalogs on its platform, along with 70% growth in active shopping feeds year over year globally.
As part of its most recent earnings release, Pinterest revealed that its platform now has 450 million monthly active users globally, a 4% jump year-on-year. Pinterest has been focused on enhancing the shopping experience on its platform over the past few years, and said during its earnings call that it wants to make every pin shoppable, including videos.
VR Is Revolutionizing Therapy. Why Aren’t More People Using It? – CNET
Sam Stokes, a New Zealand-based sales manager, isn't usually an anxious person. But there's one thing that, as he puts it, scared the shit out of him: needles. His aversion was severe enough to hold him back from getting routine tests. Stokes, now 40, recalls an instance in his 20s when he simply couldn't bring himself to get a blood test. He once even drove to the testing facility to get his blood drawn, but couldn't follow through with it. His partner (now wife) eventually convinced him to get the test, but he remembers it as one of "the most horrific" experiences he's had.

"I kind of passed out a little bit along the way, and was sweaty and clammy and all that sort of stuff," he said. "I just absolutely hated the whole experience."

When the COVID-19 pandemic arrived, he knew he couldn't let his needle phobia hold him back. Even watching the news became difficult, as stations regularly ran stories about vaccine developments.

"Every third story was an image of a needle, and that just freaked me out," he said.

But by the time Stokes became eligible to get the vaccine, he didn't feel a pinch of anxiety. No clammy hands, no cold sweats. Nothing.

Stokes overcame his phobia through virtual reality, a buzzy technology that industry giants like Meta and Sony believe could be the future of gaming and online socializing. With Apple expected to announce its first headset this year, 2023 could be a landmark moment for the technology.
How Razer Is Bringing Vibration ‘Soundtracks’ to Tomorrow’s Games and Movies – CNET
At GDC 2023, I sat down in gaming accessory company Razer's office and felt something I'd never experienced before: playing a video game and having my controller and headphones vibrate at different intensities that I could adjust to my liking. Then I watched a blockbuster superhero film with headphone vibration tuned to the action -- all powered by the same software.

The software development kit, or SDK, created by tech studio Interhaptics, which was acquired by Razer last year, lets companies easily add vibration to their games, films and other media. Interhaptics founder Eric Vezzoli, now Razer's general manager of Interhaptics, walked me through a demonstration of what the software can do. He noted that the software takes just a day to implement into a game, and then vibration will be automatically added for any feedback device, be it a controller, smartphone, headphones, haptic vest or other device. Even if a developer is adding peripherals with different vibration frequency ranges, the software can add haptic feedback that's suited for each device. That simplifies the process when, say, trying to set vibration levels to be similar on iPhones and Android phones, which have very different vibration ranges.

"We take the designer's intention and we translate it to machine capability," Vezzoli said.

The haptic composer software, as it's properly called, also puts vibration control in gamers' hands. In the game demo I played, I was able to toggle whether vibrations would happen when triggered by my character, enemies or the environment, as well as tone them down if they were too intense.

The software SDK launched with support for PS4, PS5, Meta Quest 2 and X-input controllers, as well as iOS and Android phones.
Developers can set up custom vibrations for potentially any number of different peripherals with haptics, allowing them to pulse or vibrate at different intensities to convey whatever emotion or action fits the game or movie scene.

That list of peripherals includes the Razer Kraken V3 HyperSense headphones, which have haptic motors spread around both earcups and are the headphones I wore for the demo. While I was playing the simple dungeon-crawling game that Vezzoli and his team built to show off the SDK, every sword swing by my character pulsed vibration around my ears, while enemies hitting my character buzzed my ears in a noticeably different way. Then I watched scenes from films with headphone vibration coinciding with exciting moments -- buzzing along while a superhero used their powers, or, during a suspenseful silence, pulsing at a low frequency that subtly alternated between ears, like a heartbeat.

If I'm being honest, it felt weird to have headphones buzzing around my ears with dynamic patterns -- the pitter-patter of heartbeats or triumphant vibrating bursts of superheroes clashing, which I'm used to hearing via sound effects, not feeling on my skin. But I could see how, if I were to get used to dynamic vibrations around my ears -- or with future devices, elsewhere on my body -- they could make entertainment more immersive. I remember discovering how much listening to footsteps made me better at finding enemies in first-person shooters, and dynamic vibrations about explosions or activity could similarly point me in the right direction. Movies and shows, which rely on visuals and soundscapes to convey tone and mood, could add a new layer with haptics -- and the technology seems ideally suited for VR developers to add texture to their immersive worlds.

Razer and Interhaptics' software is admittedly a bit future-facing, since controllers and smartphones are far more common than vibrating headphones or other peripherals.
But the company is sending out developer kits with the Razer Kraken V3 HyperSense headphones for developers to try adding the SDK software to their game. "It's a different type of experience, and we believe we can generate enormous value from a user experience playing these games," said Vezzoli.
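Vezzoli's line about translating "the designer's intention" to "machine capability" suggests a simple abstraction: author one normalized effect, then scale it into each device's real vibration range. This is a hypothetical sketch of that idea, not Interhaptics' actual SDK; the device profiles, numbers and function names here are invented for illustration.

```python
# Hypothetical sketch: one authored effect, mapped per-device.
# A designer specifies intensity and frequency as normalized 0..1
# values; each device profile scales them into hardware limits.

DEVICE_PROFILES = {
    # name: (min_freq_hz, max_freq_hz, max_intensity) -- made-up values
    "controller": (60.0, 250.0, 1.0),
    "phone":      (150.0, 230.0, 0.8),
    "headphones": (40.0, 120.0, 0.6),
}

def render_effect(intensity, frequency, device):
    """Map a normalized effect (both in 0..1) onto a device's range."""
    lo, hi, max_i = DEVICE_PROFILES[device]
    return {
        "freq_hz": lo + frequency * (hi - lo),
        "intensity": intensity * max_i,
    }

# The same authored effect renders differently on each device.
effect = render_effect(0.5, 0.5, "headphones")
```

The design point is the one Vezzoli describes: the designer authors intent once, and the per-device mapping absorbs the hardware differences between, say, an iPhone motor and a haptic headset.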
‘X-Ray Vision’ Could Be the Next Superpower You Get With Augmented Reality – CNET
X-AR: Seeing Through Walls With Augmented Reality
X-ray vision might not be strictly for superheroes much longer, thanks to a new application of a familiar technology.

Augmented reality offers a layer of virtual content that usually goes on top of the picture of the world taken in by our eyes, but researchers at MIT are using AR to help us see beyond barriers in a kind of "X-ray vision" they're calling X-AR.

The flexible X-AR antenna on Microsoft's HoloLens.
MIT
X-AR is powered by a flexible antenna that adds a sort of "sixth sense" to the HoloLens, allowing it to locate specific objects that are outside the wearer's line of sight, as long as they're marked with a widely used item known as a radio frequency identification (RFID) tag and are less than 15 feet away.

The prototype of this system is built as an add-on to Microsoft's HoloLens, but researchers say it could be applied to other augmented reality headsets down the line.

X-AR being tested in a warehouse-like setting.
MIT
The research team developed their own HoloLens app that directs the wearer toward the desired object and, using the HoloLens' built-in hand-tracking features, alerts them as to whether the correct object has been picked up.

Potential applications for this technology right now include use in warehouses, shipping, retail and other places where RFID tags are commonly used. However, researchers say the potential applications of this sort of technology go much further -- including determining whether food is safe to eat, and aiding search and rescue efforts in the event of a disaster.

To watch X-AR in action and see our interview with members of the research team that created it, check out the video in this article.
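The guidance loop the researchers describe (locate a tag, check that it's in range, steer the wearer toward it) can be sketched in a few lines. This is an illustrative reconstruction, not the team's code; the 15-foot limit comes from the article, while the function and positions are assumptions for the sake of the example.

```python
import math

MAX_RANGE_FT = 15.0  # the article's stated limit for tag localization

def guidance(wearer_xy, tag_xy):
    """Return whether a tag is locatable and the heading toward it.

    Positions are (x, y) in feet; heading is in degrees, 0 = +x axis.
    """
    dx = tag_xy[0] - wearer_xy[0]
    dy = tag_xy[1] - wearer_xy[1]
    distance = math.hypot(dx, dy)
    in_range = distance <= MAX_RANGE_FT
    heading_deg = math.degrees(math.atan2(dy, dx))
    return in_range, round(distance, 2), heading_deg

# A tag 9 ft east and 12 ft north is exactly 15 ft away: still in range.
print(guidance((0, 0), (9, 12)))
```

The real system of course has the harder job of estimating the tag's position through walls from radio measurements; this only sketches the steering step once a position estimate exists.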
Should You Upgrade to the iPhone 14? How It Compares to Older Phones – CNET
This story is part of Focal Point iPhone 2023, CNET's collection of news, tips and advice around Apple's most popular product.
The iPhone 14 lineup introduces a number of improvements, such as car-crash detection and a 48-megapixel main camera on the Pro models. That might make it tempting to upgrade, but doing so might not make sense for everyone. Above all else, the answer to whether you should upgrade depends on which phone you currently own. If you have a recent model like the iPhone 13 or maybe even 12, it would be wise to wait. If your phone is older, though, it's worth figuring out what you stand to gain by jumping to a newer iPhone.

The $799 (£849, AU$1,399) iPhone 14 brings modest improvements but not game-changing ones. Those changes include nitty-gritty camera improvements and support for satellite-based emergency messaging. The iPhone 14 also has a new internal design, with simpler access to internal components, making it easier to repair than previous models. iFixit, a website that disassembles tech products and assesses how easy they are to fix, called it "the most repairable iPhone in years." And if you want these features in a larger size, the iPhone 14 Plus starts at $100 more, at $899.
iPhone 14 Review: A Decent Upgrade for Most
The iPhone 14 Pro and Pro Max's upgrades are more dramatic, but you still don't need to upgrade unless you can score a great trade-in deal. Apple saved its most interesting new features for the Pro lineup, including the Dynamic Island that replaces the notch, the new A16 Bionic processor and a 48-megapixel main camera sensor. It's important to remember that you don't have to buy the iPhone 14 to get camera, battery and performance improvements over an earlier iPhone. The recently discounted $699 iPhone 13 or the smaller $599 13 Mini could be a good option if you still want more storage, faster performance and an improved camera, especially if you're coming from a phone that's 3 years old or more. It's also the only option if you want the smaller Mini and its 5.4-inch screen, since the iPhone 14 line eliminates that size in favor of the new $899 iPhone 14 Plus with a 6.7-inch screen.
Purchasing decisions will always vary depending on budget, how well your phone works right now and your personal needs, so there's no simple answer that works for everyone. But here are the biggest differences between the iPhone 14 lineup and previous iPhone generations, to help you make a decision.
The iPhone 13.
Patrick Holland/CNET
iPhone 14 vs. iPhone 13, 13 Pro, 13 Pro Max

The iPhone 14 lineup introduces new features such as car-crash detection, the removal of the physical SIM card for US phones, and enhanced cameras on the rear and front. Despite those changes, the iPhone 14 isn't different enough to justify upgrading from the iPhone 13. And even though Apple finally got rid of the infamous notch in the Pro models, the 14 and 14 Plus still have one -- it's the same smaller notch that debuted on the iPhone 13 series. In fact, the iPhone 14 represents "one of the most minimal year-over-year upgrades in Apple's history," according to CNET's Patrick Holland, who reviewed Apple's latest phones. The iPhone 14 and iPhone 14 Plus have the A15 Bionic chip from last year's iPhone 13 Pro and iPhone 13 Pro Max. The 14's screen looks exactly like the one on the 13.

Perhaps the most prominent change this year is the introduction of a larger version of the iPhone 14 called the iPhone 14 Plus, which has a 6.7-inch screen like the Pro Max. That means you no longer have to splurge on Apple's most expensive iPhone if you want the largest screen possible. Of course, the iPhone 14 is still highly rated, but we recommend saving your money and skipping the upgrade. If you've made up your mind to upgrade, we suggest going for an iPhone 14 Pro or iPhone 14 Pro Max if you can afford it. These phones are expensive, but buying one gets you access to some salient changes -- namely a high refresh rate display, Apple's new Dynamic Island multitasking bar, an always-on display and better cameras, among other features.

The bottom line: If you have an iPhone 13 or 13 Pro, don't upgrade. But if you are determined to get a new phone, go for the iPhone 14 Pro or 14 Pro Max, especially if you must have the Dynamic Island right now. Read more: iPhone 14 Pro and 14 Pro Max Review

The iPhone 12.
Sarah Tew/CNET
iPhone 14 vs. iPhone 12, 12 Pro

Even though the iPhone 12 lineup was released two years ago, it still shares many similarities with Apple's latest phones. Both the iPhone 12 and iPhone 14 support 5G, run on fast processors, offer great cameras and include MagSafe accessory compatibility.

Since the iPhone 14 is more of a refresh than a major upgrade, we recommend hanging onto your iPhone 12 if it's still in good condition. You can take advantage of the iPhone's latest software features, such as lock screen customizations, widgets and the ability to unsend text messages, by upgrading to iOS 16.

The iPhone 14 received a few notable camera upgrades, like a larger sensor, a new lens with a faster aperture, improved photo processing and Action mode, which makes the movements in videos look smoother when you record them. But the iPhone 12's cameras remain excellent even though they are 2 years old. The iPhone 12 has a 12-megapixel dual-camera system, while the iPhone 12 Pro includes a third camera with a telephoto lens. Check out our iPhone 12 review to see how the cameras held up when CNET put them through their paces.

It's worth remembering that you get more noticeable upgrades with the iPhone 14 Pro and Pro Max. These include everything that's new in the 14, as well as an upgraded main camera with a larger 48-megapixel sensor, an ultrawide camera that allows you to take macro photos and a third camera with a telephoto lens. If you can get a good trade-in deal that significantly knocks down the iPhone 14 Pro's price, upgrading from the regular iPhone 12 is a decent step up.

The bottom line: Hold onto your iPhone 12 for another year, since the iPhone 14 isn't dramatically different. However, the iPhone 14 Pro and 14 Pro Max bring more significant changes that could be worthwhile if you can snag a good trade-in deal. Read more: All the "New" iPhone Features That Have Been on Android for Years

The iPhone 11.
Angela Lang/CNET
iPhone 14 vs. iPhone 11, 11 Pro

If you're using an iPhone 11, we recommend upgrading to an iPhone 14 (or even an iPhone 13). In the last three years, Apple has made enough changes to features including battery life, performance, screen quality, cameras and durability to merit buying a new iPhone. Upgrading to the iPhone 14 will get you 5G support, more storage (128GB at the base level versus 64GB), a better main camera with a wider-aperture lens, new video shooting options like Action mode and Cinematic mode, a better selfie camera with Night mode and Apple's Photonic Engine processing, compatibility with Apple's MagSafe accessories, longer battery life and faster performance. That's in addition to car-crash detection and Apple's new emergency satellite messaging feature. Most of the photography and videography improvements are dramatic changes compared to the iPhone 11. And the longer battery life and additional storage space are welcome upgrades that you'll notice on a daily basis.

As previously mentioned, if you go for the 14 Pro instead, you get a new 48-megapixel main camera, a closer 3x optical zoom versus the 11 Pro Max's 2x zoom, the Dynamic Island instead of the notch and numerous other upgrades like an always-on display.
The bottom line: The iPhone 14 lineup includes enough changes to justify upgrading from the iPhone 11. But if your phone is still in good condition and you're satisfied with it, install iOS 16 and hold onto it for another year.

The iPhone XS.
Josh Miller/CNET
iPhone 14 vs. iPhone XS, XS Max, XR

If you bought the iPhone XS, XS Max or XR at launch, that means your phone is roughly 4 years old and may be starting to feel sluggish. That alone makes a strong case for upgrading, but there's plenty more to gain. Compared to the iPhone XS, the iPhone 14 provides six hours of additional battery life (according to Apple's estimates). In addition to everything that's new in the iPhone 14 specifically, you'll also get other upgrades Apple has added to the iPhone over the past few years. Those include 5G support, more storage (again, you get 128GB versus 64GB), faster performance and a better camera.

The iPhone XS generation lacks Night mode for taking clearer pictures in the dark, and it also doesn't have Deep Fusion, which is Apple's name for its image processing technique that improves detail and clarity in darker environments. The XS's front camera has a lower 7-megapixel resolution compared to the larger and newer 12-megapixel sensor on the iPhone 14. If you're upgrading from an iPhone XR, you'll also get an additional camera with an ultrawide lens for taking broader group shots for the first time.

The iPhone 14 also has a larger 6.1-inch screen compared to the iPhone XS' 5.8-inch display (the iPhone XS Max has a 6.5-inch screen, while the XR's screen is also 6.1 inches). The design has also changed quite a bit over the past four years; newer models have flat edges, a slightly smaller notch, different finishes and a new "squircle"-shaped camera module that replaces the pill-shaped rear camera cutout. So your phone will not only feel more modern, but it'll look newer, too.
The bottom line: If you have an iPhone XS, XS Max or XR, it's definitely worth upgrading. You get a noticeable boost in camera quality, battery life and performance, among other areas.

The iPhone X.
James Martin/CNET
iPhone 14 vs. iPhone X

The iPhone X is about 5 years old, which means it probably feels slow and its battery life isn't what it used to be. With an iPhone 14, you'll notice a major upgrade in both categories, as well as in design, durability, connectivity and camera quality.

Let's start with performance. The iPhone X runs on a much older A11 Bionic chip that's now 5 years old, while the iPhone 14 runs on Apple's A15 Bionic processor. The iPhone 14 Pro and Pro Max run on Apple's newer A16 Bionic chip. Both new processors are way ahead of the A11 chip, which only has a two-core neural engine compared to the A15 Bionic's 16-core neural engine. The iPhone's neural engine powers tasks that rely on machine learning and artificial intelligence, which are becoming a bigger part of the iPhone experience. Things like app suggestions in the App Library and Apple's Translate app rely on machine learning to function, which indicates that the iPhone X may struggle to keep up with newer capabilities.

The iPhone X also has a dual-lens camera similar to that of the iPhone XS, meaning it's missing the iPhone 14's camera hardware improvements in addition to Night mode, Deep Fusion and the ability to control depth-of-field and blur levels in Portrait mode. Like the iPhone XS, you're only getting a 7-megapixel front camera compared to a 12-megapixel selfie camera on Apple's newer phones.

Apple's five-year-old iPhone also has shorter battery life, with Apple estimating it should last for 13 hours of video playback compared to 20 hours on the iPhone 14. The iPhone 14's 6.1-inch screen is bigger than the 5.8-inch display on the iPhone X, and it should also be brighter, since it can reach 800 nits of max brightness compared to the iPhone X's 625-nit screen. The iPhone 14 supports Dolby Atmos and spatial audio playback, while the iPhone X just has stereo playback.
That's probably not a deal-breaker, but it could matter if you watch a lot of video on your phone without headphones. And of course, there's the benefit of getting car-crash detection, Apple's new Emergency SOS via satellite option, better water resistance (up to 6 meters for 30 minutes versus 1 meter), 5G support, more storage space, Ceramic Shield for the display, a refreshed design and the option to use MagSafe accessories on the iPhone 14.
The bottom line: If you have the iPhone X, it's time to upgrade. The iPhone 14 will feel new in just about every way, from the camera to performance, battery life and the way it looks and feels.

The iPhone 8 and 8 Plus.
Gabriel Sama/CNET
iPhone 14 vs. iPhone 8, 8 Plus
The iPhone 8 generation has Apple's legacy iPhone design, which is fitting for a phone that's now 5 years old. If you have an iPhone 8 and are considering an upgrade, many of the reasons to do so are the same as the reasons to upgrade from the iPhone X. The processor is getting old, which could make it harder to use newer iPhone features that rely on machine learning. The cameras are outdated and lack features like Night mode (the smaller iPhone 8 doesn't have Portrait mode either, since it has only one lens). By upgrading, you'll get more storage, significantly longer battery life, and support for 5G connectivity and MagSafe accessories, too.

But the biggest difference is in the iPhone 8's design, which is much more than just an aesthetic upgrade. Phones with Apple's more modern edge-to-edge screen trade Touch ID for Face ID, which lets you unlock your phone and authenticate payments just by looking at your device. If you prefer Touch ID over Face ID, especially since it's difficult to use Face ID while wearing a mask, you might want to at least consider the $429 iPhone SE, since it has the same processor as the iPhone 13, 5G compatibility and plenty of photography improvements inside a body similar to the iPhone 8's.

Upgrading to the iPhone 14 also brings a noticeably large jump in display size and quality. Since newer phones like the iPhone 14 don't have a home button, there's more room for Apple to expand the screen without making the device feel cumbersome. The iPhone 14's screen is even larger than the iPhone 8 Plus' 5.5-inch screen, despite the device itself feeling more compact. (For more perspective, consider that the iPhone 13 Mini has a 5.4-inch display.) If you go for the 14 Pro, you get another big change: the Dynamic Island, which transforms the notch into an area for viewing alerts, system notifications and apps running in the background, like Spotify or Apple Music.
From personal experience, switching from an iPhone 8 (which has a 4.7-inch screen) to the iPhone 12's 6.1-inch display makes reading, checking email and watching videos much more comfortable. The screen isn't only larger, but it's also more vibrant with better contrast since it uses an OLED display rather than LCD.
The bottom line: The iPhone 14 is a huge jump from the iPhone 8. Everything about this phone will feel fast and new: the much larger and bolder screen, Face ID, the speedier processor, the longer battery life and, of course, the substantially upgraded cameras. That said, if you want a newer iPhone but prefer the iPhone 8's design, trade up to the current 2022 iPhone SE.

The iPhone 7 Plus and iPhone 7.
Sarah Tew/CNET
iPhone 14 vs. iPhone 7, 7 Plus
If you have an iPhone 7, it's time to upgrade. It is 6 years old, and it shows in everything from the processor to the camera and storage space. The iPhone 7 doesn't support iOS 16, which is even more incentive to get a newer device. While we generally recommend the iPhone 14 Pro over the iPhone 14 in most cases, coming from a phone this old means you'll find plenty that's new in the iPhone 14.

The iPhone 7 runs on an aging A10 Fusion processor, which doesn't even have a neural engine and is years behind Apple's latest technology. It has a single-lens camera without Portrait mode, while the 7 Plus has two cameras. But those cameras lack many modern features like Night mode and Portrait Lighting, which adds specific lighting effects to your portraits. Similar to the iPhone 8, the iPhone 7 series includes Touch ID and comes in 4.7- and 5.5-inch screen sizes. But since the iPhone 7 is a year older than the iPhone 8, it's also missing wireless charging, which means you must plug it in to charge. If you've owned an iPhone 7 for several years, it's probably bursting at the seams, since it has substantially less storage space: the entry-level iPhone 7 came with only 32GB, a quarter of the capacity available on the cheapest iPhone 14.

The iPhone 14 brings major gains in nearly every respect. The standard model has a larger, bolder and brighter bezel-free 6.1-inch screen that still feels compact since it doesn't have a home button. It runs on Apple's A15 Bionic processor, which is better equipped to handle newer iOS features. And it has a drastically improved dual-lens camera with a larger main sensor and advanced features like the new Cinematic mode for video and Night mode. Plus, Apple's estimates indicate it'll offer seven hours of additional battery life during video playback, which is a huge bump.
The bottom line: If you're still holding onto your iPhone 7, there's no question that you're due for an upgrade. A better screen, compatibility with iOS 16, longer battery life and more advanced cameras are just a few of the gains the iPhone 14 has to offer over the iPhone 7. And similar to my recommendation with the iPhone 8, if you really want to keep the home button and save some money, consider the iPhone SE. It gives you more recent performance upgrades while keeping a similar phone style.
Make Your FaceTime Calls Sound Better With This Trick
FaceTime calls can be hectic, from ensuring you have a good connection so your video isn't choppy to making sure the camera angle isn't looking up someone's nose. But you can cut back on the chaos by enabling your iPhone's Voice Isolation feature in FaceTime.

Voice Isolation for FaceTime calls was introduced with the release of iOS 15 in 2021. The feature muffles background noises, like the sound of kids playing in a nearby room or construction outside your window, so others on the FaceTime call can hear you without interruptions.

Unfortunately, you won't find the Voice Isolation feature in Settings. Instead, you have to be in a FaceTime call to enable it. But once you turn the feature on, it will stay on the next time you're in a FaceTime call.

Here's how to activate Voice Isolation so people can hear you more clearly in FaceTime calls.

How to enable Voice Isolation
1. Start or join a FaceTime call.
2. Swipe down from the top-right corner of your screen to access your Control Center.

Mic Mode appears in your Control Center when you're in a FaceTime call.
Zach McAuliffe/CNET
3. Tap Mic Mode near the top-right corner of your screen.
4. Tap Voice Isolation.

Pro tip: You don't have to interrupt a FaceTime call with family and friends to turn this feature on. You can call yourself on FaceTime and enable Voice Isolation by following the steps above.

How to enable Wide Spectrum
In Mic Mode, there's another feature alongside Voice Isolation called Wide Spectrum. While Voice Isolation muffles other noises around you in a FaceTime call, Wide Spectrum enhances the noises around you without affecting your voice. Wide Spectrum is good for conference calls over FaceTime or if you have a large group of people FaceTiming someone else. If you have a little brother or sister who moves away to college and you and your family want to wish them a happy birthday, Wide Spectrum can help you all sing to them at once.

Opening Mic Mode shows you Standard, Voice Isolation and Wide Spectrum options.
Zach McAuliffe/CNET
To enable Wide Spectrum, follow the steps above, but tap Wide Spectrum instead of Voice Isolation. Like Voice Isolation, Wide Spectrum will stay enabled the next time you make a FaceTime call. To disable either Voice Isolation or Wide Spectrum, follow the instructions above and tap Standard. This will return your microphone to its default setting.

For more on your iPhone, check out the 13 iPhone features you might not know about and 22 iPhone settings you should change now.
Microsoft fixes reversible screenshot vulnerability on Windows
The security flaw could let hackers revert the edited portions of screenshots, potentially revealing private information that someone tried to crop or scribble out.
Illustration: Alex Castro / The Verge

Microsoft has pushed an update to fix a screenshot editing vulnerability in Windows 10 and 11, as spotted earlier by Bleeping Computer. The security flaw, dubbed "aCropalypse," could let bad actors recover the edited portions of screenshots, potentially revealing personal information that had been cropped out or concealed.

According to Microsoft, the issue (CVE-2023-28303) affects both the Snip & Sketch app on Windows 10 and the Snipping Tool on Windows 11. However, it only applies to images created in a very specific set of steps: those that were taken, saved, edited and then saved over the original file, as well as ones opened in the Snipping Tool, edited and then saved to the same location. It doesn't affect screenshots modified before saving, nor screenshots that were copied and pasted into, say, the body of an email or document.

Microsoft first learned of the issue earlier this week, when Chris Blume, the chair of the working group for the PNG image format, brought it to the attention of David Buchanan and Simon Aarons, the same security researchers who discovered the aCropalypse vulnerability affecting the Google Pixel's Markup tool. That flaw similarly lets hackers reverse changes made to screenshots, making it possible to reveal personal information in an image that someone thought they were hiding, whether by cropping it out or scribbling over it.

You can download the latest updates for the affected apps on Windows by heading to the Microsoft Store, clicking Library, and then choosing Get updates. Once updated, Snip & Sketch should be at version 10.2008.3001.0, and the Snipping Tool at version 11.2302.20.0.
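The underlying file-handling mistake is simple to demonstrate: if an app writes a smaller edited file over the original without truncating it, the original's trailing bytes stay on disk, where a determined attacker can dig them out. The sketch below is illustrative only — it uses plain bytes rather than real PNG data, and all names are hypothetical — but it shows the same failure mode:

```python
# Illustrative sketch of the file-handling flaw behind "aCropalypse"-style
# leaks (hypothetical names and data; the real tools wrote PNG files).
# Saving a smaller edited file over the original WITHOUT truncating it
# leaves the original's trailing bytes on disk.
import os
import tempfile

def save_without_truncate(path: str, new_data: bytes) -> None:
    # "r+b" rewrites from the start of the file but does not shrink it,
    # mimicking the vulnerable save path.
    with open(path, "r+b") as f:
        f.write(new_data)

original = b"SENSITIVE-PIXEL-DATA " * 20  # stands in for the full screenshot
edited = b"cropped"                       # stands in for the smaller edit

fd, path = tempfile.mkstemp()
os.close(fd)
with open(path, "wb") as f:
    f.write(original)                     # the screenshot as first saved

save_without_truncate(path, edited)       # "save over" the original

with open(path, "rb") as f:
    data = f.read()

print(len(data) == len(original))         # True: the file never shrank
print(b"SENSITIVE" in data[len(edited):]) # True: old bytes still recoverable
os.remove(path)
```

The fix is equally simple in principle: truncate the file to the new length (or rewrite it from scratch) whenever saving an edit, which is effectively what the patched tools now do.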
Just like the patch Google issued, Microsoft's fix won't alter edited screenshots that have already been posted online, which could leave thousands of exploitable images on the web.
Epic made a Rivian R1T demo to show off its latest Unreal Engine 5 tools
In 2020, Epic Games publicly demoed Unreal Engine 5 for the first time. Nearly three years later, gamers are still waiting for the tech to go mainstream. Outside of Fortnite and The Matrix Awakens, there aren't any UE5 games you can play right now, and the first salvo probably won't arrive until the end of the year at the earliest. None of that stopped Epic from showcasing the engine's latest capabilities with a handful of new demos during its recent State of Unreal keynote at GDC 2023.
Arguably the most impressive one saw Senua’s Saga: Hellblade 2 developer Ninja Theory show off Epic’s new MetaHuman Animator. The tool promises to make realistic facial capture accessible to indie developers by allowing them to use an iPhone, instead of dedicated equipment, to capture facial performances. As you can see from the two demos Epic shared, the tool makes it possible to quickly and accurately transform a closeup video of an actor into something a studio can use in-game. Epic said the animator would launch this summer.
Separately, Epic showed off some of the enhancements coming to Unreal Engine 5.2 with a demo that featured, of all things, a digital recreation of Rivian's R1T electric truck. The EV turned out to be the perfect showcase for UE5's new Substrate shading system. The technology allows artists to create different shading models and layer them as they see fit. In the demo, Epic gave the R1T an opal body to show how Substrate can allow different material layers to interact with one another without creating lighting artifacts. The demo was also a showcase for Epic's new set of Procedural Content Generation tools, which allow artists to create expansive, highly detailed levels from a small set of hand-crafted assets.
If all goes according to plan, it won’t be much longer before the first slate of Unreal Engine 5 games arrive. Provided it’s not delayed again, Stalker 2: Heart of Chornobyl is slated to release this year. Lords of the Fallen and Black Myth: Wukong, two other UE5 projects, don’t have a release date yet but have been in development for a few years now.