Trend 2020: iOS 14 with better stability and AR

Stephan Wiesend

With the upcoming iOS 14, Apple will deliver better stability and performance, but also new technologies in the areas of location tracking and the camera.

Apple promises more stability for iOS 14.

The thirteenth floor and seat number thirteen on a plane are often avoided, and even with iOS 13 Apple did not exactly win over the public. The thirteenth system version was one of those troubled updates where friends and acquaintances were advised to wait for the first point release. For many users it was a big disappointment compared to the stable iOS 12, which still runs on many devices. The predecessor had fewer new functions but offered first-class stability and performance. With iOS 14, Apple wants to do better.

Will the SE continue to be supported?

But which devices will still support the upcoming iOS 14? With iOS 13, the long-expected end had come for the iPhone 6 and iPhone 5s. Some owners of an iPhone 6s or iPhone SE are therefore nervous about iOS 14. There will probably be a system update for these devices in 2020 as well: Apple only stopped selling the iPhone 6s in India six months ago. Although the model appeared in 2015, it is still suitable for everyday use and still available in stores in Germany. We are less sure about the iPhone SE, as its screen resolution is no longer up to date. The end of support would be regrettable for many young iPhone users, as many SEs have by now been passed down to younger family members.

It could be tight for the iPhone SE.

Siri and iCloud

Almost taken for granted: iOS 14 will offer improved Siri and iCloud functionality. Apple has already made many improvements behind the scenes, but still has to catch up with competitors such as Google and Amazon. Apple is investing a lot of money in these areas, yet the results will, as before, improve mostly in the background; big jumps should not be expected. Siri's results get better every year; unfortunately, many users who were disappointed by old Siri versions barely notice this. It is not due to a lack of manpower: in the middle of last year it became known that Apple is expanding its research location in Seattle, just one of several sites. In addition to the existing 500 employees, a further 2,000 are to be added in the next few years, and a building with over 6 hectares of space will be rented. Some of the job advertisements are also revealing: a striking number of developers are being sought for areas such as Siri, Maps, iCloud and machine learning. Last but not least, the interplay between Siri and Maps will probably improve considerably in the coming versions.

Apple seems to be investing heavily in the further development of Siri.

Location: an underestimated function

The new iPhones came with a new wireless module that, to the surprise of many users, offers few functions so far - some may not arrive until iOS 14. The iPhone 11 models were the first Apple products to receive a chip for so-called ultra wideband (UWB), which goes by the name U1. It currently fulfills only one rather exotic function: according to the product page for the iPhone 11, the chip can identify the position of another iPhone 11 in the room via ultra wideband and send data to that device more quickly via AirDrop. This so-called spatial awareness allows a device's position to be determined with an accuracy of just under ten centimeters. That is a suspicious amount of effort for a rather unimportant function - why a separate chip for this? The reason lies in its special capabilities: the range is 290 meters, positioning is around a hundred times more precise than with Bluetooth or WLAN, and the chip is very energy-efficient and fifty times faster than GPS positioning. As Brian Roemmele points out on Quora, the product page carries a telling addendum: "And that's just the beginning". The possible uses are extremely extensive, as Roemmele explains in his very detailed article. UWB technology is suitable for numerous applications, but it is ideal for AR and the long-awaited Apple Glasses, as it enables precise location determination - a weakness of current devices, whose laser and infrared sensors quickly reach their limits. According to Roemmele, areas of application include not only indoor location but also AR/MR/VR applications of all kinds, automotive, drones and robotics. Collision warning for cars or even pedestrians - a kind of personal radar system - would be another use case.
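To see why UWB can be so much more precise than Bluetooth or WLAN, it helps to recall the basic principle: distance is derived from the time of flight of very short radio pulses. The following toy sketch is purely illustrative - it has nothing to do with Apple's actual U1 firmware - and shows the arithmetic behind simple two-way ranging:

```python
# Toy sketch of UWB two-way ranging (illustrative only, not Apple's U1
# implementation). Distance follows from the round-trip time of a radio
# pulse, minus the responder's fixed reply delay: d = c * t_flight / 2.

C = 299_792_458.0  # speed of light in m/s

def distance_from_round_trip(t_round_ns: float, t_reply_ns: float) -> float:
    """Distance in meters, given round-trip time and reply delay in ns."""
    t_flight_s = (t_round_ns - t_reply_ns) * 1e-9 / 2  # one-way flight time
    return C * t_flight_s

# 20 ns of net round-trip time corresponds to roughly 3 meters.
print(round(distance_from_round_trip(t_round_ns=1000.0, t_reply_ns=980.0), 2))
```

At the speed of light, a single nanosecond of timing error already corresponds to about 30 centimeters, which is why the extremely short UWB pulses allow far finer positioning than the longer symbols used by Bluetooth or WLAN.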

Apple names targeted AirDrop use as one of the tasks of the U1 chip.

One should also remember the huge sums that Apple is investing in the further development of Maps. Perhaps ultra wideband will soon be usable for tracking too? It would be interesting if the chip were also installed in other Apple devices, for example in AirPods or their charging cases, in iPads, or in the standalone trackers Apple is reportedly planning. A lost AirPod could then be located quickly. Before the keynote there was a lot of speculation about standalone tracking devices apparently planned by Apple - products similar to the Tile key finder - with which you could track pets or luggage, for example. Given the small chip's low power consumption, these tracking discs could measure less than two inches.

With this technology it is also possible to map your own home, i.e. to create a plan of the area. Apple may soon open the technology to developers in order to open up new possibilities for apps - we may hear more about this at WWDC 2020. According to information from Mark Gurman, the iPad Pro 2020 will come onto the market with another exciting innovation: support for depth measurement. Two camera sensors plus a 3D system would then enable the three-dimensional detection of rooms and objects. In principle this works like the Face ID system already built into the 2018 iPad Pro - but instead of a face, the system could also capture the objects in a room and create a 3D map of it. This is especially important for augmented reality applications: with a conventional camera alone, the distance to an object can only be determined imprecisely. AR remains a future technology in 2020, one in which Apple has been investing heavily for years: according to Bloomberg, over 1,000 developers under Mike Rockwell are researching AR applications.

AR glasses from Apple are considered a safe bet - albeit not before 2022 - and here, too, Apple could use UWB technology for augmented and mixed reality functions.

New camera technologies

Smartphone manufacturers have recognized photo and video as a good sales argument and are increasingly outdoing each other with new camera functions. Apple is at the forefront here and has just acquired another software company in this area, Spectral Edge. Its technologies will probably only become relevant for upcoming iPhones, but further improvements can already be expected in iOS 14 - both for the camera function itself and for the editing tools. Regrettably, many new functions are tied less to the operating system than to the hardware: like the iPhone 11, the iPhone 12 will probably offer some exclusive camera functions.

Promise: a more stable system than iOS 12

For many users, however, the stability of iOS is more important than its feature set. Poor performance and short battery life are often the result of deficiencies in the system, for example when a crashing background process drains the battery. The background to Apple's iOS problems is not new: Apple has obviously had project management problems, which a disappointed developer recently dissected mercilessly in an article. Apple apparently realized that things could not go on like this and is said to have completely overhauled its software development. It was about time: shortly after the release of iOS 13.0 came the iOS 13.1 update, which added many functions. Apple's developers had apparently already realized at WWDC in June that iOS 13 would not be ready in time for the iPhone launch, so several functions were postponed to iOS 13.1. Apple itself regarded iOS 13.1 as the actual release version that could measure up to iOS 12 in quality - but even that was not entirely true, as iOS 13.1 also had many annoying problems.

Craig Federighi, who is responsible for iOS, has therefore changed project management at Apple. According to Bloomberg, Federighi presented new measures at the kickoff meeting for iOS 14 - new at least for Apple. So-called flags in the software are intended to prevent faulty or not yet fully developed features from being delivered. There are still "daily builds", but flagged components are now disabled by default - developers can, however, consciously activate them in order to test them in the beta phase. Previously, daily builds crammed with as many new functions as possible meant that Apple no longer had an overview of the state of iOS and of which features were stable and which were not. Apple wanted to bring a lot of new functions to iOS 14, but according to Bloomberg it is already clear that some of them will be postponed to iOS 15. Development resources should also be used better: the developers will concentrate on functions that really have a chance of being finished. The result, ideally: a stable system in which the promised functions work.
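The flag mechanism Bloomberg describes - unfinished features are compiled into every daily build but switched off by default, and testers opt in deliberately - can be sketched in a few lines. All names here are hypothetical; Apple's actual internal tooling is of course not public:

```python
# Minimal sketch of build-time feature flags as described in the article:
# unfinished features ship in every daily build but are disabled by
# default; beta testers can consciously switch them on.

class FeatureFlags:
    def __init__(self):
        self._flags = {}  # feature name -> enabled?

    def register(self, name: str, enabled: bool = False):
        # New, not yet fully developed features default to disabled.
        self._flags[name] = enabled

    def enable(self, name: str):
        # A tester deliberately activates a flagged feature for testing.
        self._flags[name] = True

    def is_enabled(self, name: str) -> bool:
        # Unknown features count as disabled, so a daily build never
        # accidentally ships half-finished code paths.
        return self._flags.get(name, False)

flags = FeatureFlags()
flags.register("experimental_widgets")            # unfinished: off by default
flags.register("share_sheet", enabled=True)       # stable: shipped enabled

print(flags.is_enabled("experimental_widgets"))   # False in a daily build
flags.enable("experimental_widgets")              # tester opts in
print(flags.is_enabled("experimental_widgets"))   # True
```

The point of the pattern is that every build contains the same code; only the flag table decides which paths run, so a broken feature can be kept out of the release without reverting commits.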


With iOS 14, we expect improvements above all in stability and performance. iOS 13 brought many new functions, but with iOS 14 Apple has a lot of polishing to do. It is a shame that some new functions, such as location tracking and AR, will probably only be available to users of the latest iPhones and iPads.