The two companies have been working on the contact tracing technology for a little over a month. When the project was first announced, Apple CEO Tim Cook and Google head Sundar Pichai promised the technology would alert people if someone they were in contact with over a 14-day period was diagnosed with the coronavirus.
The technology works by helping Apple iPhones and devices powered by Google’s Android software keep track of the other phones they’ve come near. They do this by sending signals to one another over Bluetooth radio, which are stored on the phones. If someone is then confirmed as having the coronavirus, their phone sends out a new signal alerting all the phones it came in contact with over the preceding 14 days.
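The broadcast-and-match flow described above can be illustrated with a simplified sketch. To be clear, this is not Apple and Google's actual protocol (their real design specifies rotating Bluetooth identifiers derived from daily keys, with all matching done on the device); the class, method names, and daily-key scheme below are assumptions for illustration only.

```python
import os
import hashlib
from datetime import date, timedelta

class Phone:
    """Toy model of a phone participating in exposure notification."""

    def __init__(self):
        self.daily_keys = {}   # date -> secret key, never leaves the phone
        self.heard = []        # rolling log of anonymous identifiers heard nearby

    def identifier_for(self, day):
        """Derive the anonymous identifier broadcast over Bluetooth."""
        key = self.daily_keys.setdefault(day, os.urandom(16))
        return hashlib.sha256(key + day.isoformat().encode()).hexdigest()

    def hear(self, identifier):
        """Record an identifier received from a nearby phone."""
        self.heard.append(identifier)

    def keys_if_positive(self, today, window=14):
        """On a positive diagnosis, share only the last 14 days of keys."""
        return {d: k for d, k in self.daily_keys.items()
                if today - d <= timedelta(days=window)}

    def check_exposure(self, shared_keys):
        """Rebuild identifiers from shared keys and match against the local log."""
        derived = {hashlib.sha256(k + d.isoformat().encode()).hexdigest()
                   for d, k in shared_keys.items()}
        return any(i in derived for i in self.heard)

today = date(2020, 4, 24)
alice, bob = Phone(), Phone()
bob.hear(alice.identifier_for(today))   # the two phones were near each other
keys = alice.keys_if_positive(today)    # Alice later tests positive
print(bob.check_exposure(keys))         # True: Bob's phone detects the exposure
```

The key privacy property sketched here is that phones only ever broadcast opaque hashes; the secret keys that link those hashes to a person are shared only after a diagnosis, and only for the 14-day window.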
Apple’s and Google’s efforts are just the latest ways big tech companies have been working to help fight the coronavirus, which has killed nearly 200,000 people around the world, and infected more than 2.7 million people.
Verily, the life sciences arm of Google parent company Alphabet, last month launched a website that gives people in California information about virus testing. The website, developed in partnership with the White House, lets people fill in symptoms and complete an online screener.
Google also last month said it’s committing more than $800 million to help small businesses and crisis responders dealing with the coronavirus pandemic. Apple and Google have both also begun making and distributing protective equipment for health care workers.
Now with this new coronavirus tracing technology, two of Silicon Valley’s biggest rivals are hoping to help create apps that’ll help us regain a sense of normalcy as we wait for a vaccine or other ways to fight the virus.
New privacy protections
Apple and Google said the technology will be opt-in, meaning it won’t be turned on by default. The companies will offer programming tools to developers in mid-May, allowing health authorities to build apps with this new technology. Then Apple and Google plan to offer software updates to the more than 2 billion active devices around the world using their software by the end of the year.
Apple said that includes any phone that can run iOS 13, the company’s latest software, which works on devices as far back as the iPhone 6S, initially released in 2015.
The companies began discussing the project two weeks ago, sharing initial planning documents publicly to offer security researchers, partners and critics a way to begin vetting the technology.
To ensure further security, Apple and Google said they’d change the contact tracing program to use better encryption, scrambling any identifying information to ensure people cannot be tracked. The companies are also protecting any potentially identifiable information about a person’s phone, such as which model of phone they’re using or the signal strength of their transmissions.
Apple and Google are looking to health officials to build apps, the companies said, but they’ll also provide assistance. The companies said it’ll be easy to build an app for this project. And for health officials who don’t want to build their own, they’ll be able to use a premade app that can be rebranded.
Call it ‘exposure notification’
The companies are also changing the terminology they’re using, moving away from the widely used term “contact tracing,” which could heighten anxieties of people worried about their privacy. Instead, they’re calling the system “exposure notification,” saying it better describes the functionality of the program while the companies shift to emphasizing that the program is “privacy-preserving.”
Whether Apple’s and Google’s software will ultimately win over people is still unclear. The companies admitted they don’t know the minimum number of people opting in that’s necessary for the system to be effective. Experts believe at least half the population would have to opt in, meaning the companies would need to convince potentially billions of people to sign up.
As part of their efforts to entice people, Apple and Google have promised to dismantle the system when the coronavirus crisis passes. That will include shutting down the application programming interface, or API, built to work with public health apps.
“The promise that Apple and Google will shut the API off is very welcome,” said Jennifer Stisa Granick, the ACLU’s surveillance and cybersecurity counsel. “We just want to make sure that this is something that’s verifiable, and that there will be an independent review to make sure the commitments they’ve made are something they’re living up to.”
Destiny 2’s first Fortnite-style live event was slow and underwhelming, but it’s a solid start – The Verge
Destiny 2’s first ever Fortnite-style live event just wrapped earlier today, and it was unfortunately not quite what some players were expecting. Instead of something monumental and game-changing for a key part of the game world, it was more of a slow-burn alternative to a standard video game cutscene. And instead of delivering a strong narrative payoff on a season’s worth of otherwise dull and repetitive activity, the Almighty event ended without any meaningful change in direction or surprising new development.
For the last three months, developer Bungie has been building up a clash between the Destiny world’s artificial intelligence supercomputer Rasputin and a large planet-destroying ship called the Almighty. All of this past season’s activities have revolved around communicating with the AI character, a largely mysterious fixture in Destiny lore before this season, and doing a series of rehashed game modes and resource collecting in service of an eventual showdown between Rasputin and the Almighty. Over the last month, players were asked to participate in a mind-numbing number of public event activities to unlock an old Destiny 1 weapon and a brief story mission, with the promise of more to come at season’s end.
Some players expected the Almighty to actually crash into the game’s Tower social hub. Others expected a cutscene or perhaps some form of real-time space battle that would destroy or in some way change the Tower. What we actually got was a severely understated version of the latter, in which the image of the Almighty changed in excruciatingly slow fashion with new animations and, eventually, its destruction. Yet the whole execution felt a bit slapdash and underwhelming.
The event got off to a bumpy start when, at the scheduled 1PM ET start time, nothing appeared to happen. The delay, whether intentional or not, lasted for more than 20 minutes, but it did give players ample time to load into the Tower and to join in with any other collaborative antics other players were engaging in.
In my instance, a row of well-decorated Guardians laid down holographic staffs as if to make a last line of defense by using the “None Shall Pass” emote, a reference to Gandalf’s legendary line when facing down the Balrog in The Lord of the Rings.
Eventually, players began noticing subtle changes in the sky in the form of large clusters of laser beams arcing toward the Almighty. The process appeared to be dynamic, so the lasers grew closer over time, but at a painfully slow pace that made it hard to track minute-by-minute movement.
The lasers made contact at around 1:50PM ET, nearly a full hour after the event supposedly first launched. (It’s unclear if there actually was a delay at first or if Bungie purposefully designed the event to be as slow-moving as it was.)
As we closed in on 90 minutes after the event first started, the ship began to explode in apparent slow motion. But that’s when the one exciting portion of the event kicked in, as the Almighty began crashing toward the surface of the Earth and bits of debris began flying off. This was the only truly dynamic part of the experience, as everything else felt like a series of subtle screenshot changes to the in-game sky.
It ultimately ended with an exciting crash landing and a shock wave, and now the Almighty’s landing site appears to be a permanent fixture of the background of the Tower. If players end up inspecting some of the crash debris (after re-zoning in), Bungie is awarding an emblem.
Still, the fact that the conclusion, which lasted under 10 minutes, required 90 minutes of buildup and resulted in little more than an emblem illustrates the mismatched expectations Bungie may have inadvertently cultivated.
I think this is cool but you gotta give more of an indication of how much time people should budget. I figured this wouldn’t go past ten minutes
— Paul Tassi (@PaulTassi) June 6, 2020
We’ve never seen Bungie try something this ambitious with Destiny 2 before, and the end result was certainly exciting when you consider what could come next. The closest the studio has come is with 2018’s Forsaken expansion, in which the game’s new location, the Dreaming City, underwent a transformation after the very first raid team bested The Last Wish activity. But it was not at the scale of the Almighty event, and this is Bungie’s first real attempt at building a months-long narrative culminating in some form of shared experience for the player base.
Fortnite, although it’s best known for being a wildly popular battle royale game, has emerged over the last few years as an industry leader in what can only be described as live, simultaneous events. These are in-game events that happen in real time and are experienced only once by every single player who happens to log in and be present for the show. Fortnite isn’t the first of its kind to do this; massively multiplayer online games (MMOs) and online sims like Second Life have been experimenting in this department for years. But Fortnite features undeniably the most impressive and technically challenging versions of these events in all of gaming.
Starting with an in-game rocket launch back in 2018 and growing steadily more ambitious every few months with more complex and ever-evolving events like last fall’s black hole stunt, developer Epic has proven it has the technical chops to do what video games even just five years ago considered almost impossible. More recently, Epic held a stunning Travis Scott show that projected the rapper as a superhuman skyscraper-sized hologram for more than 12 million players and, last summer, concluded a multi-month storyline with a mecha-monster showdown, Pacific Rim-style.
Part of what makes Fortnite’s events so fun and feel so unprecedented is that they are so intricately built over time. Epic, through whatever technical achievements it’s built under the hood of its battle royale game (the developer has never shared how it pulls these events off), is able to change its map in subtle ways almost every day, adding clues to find and expanding teasers of larger events all without having to take down its server for maintenance. Some of its most successful feats have included a live event that then kicks players right into an all-new, changed game map, no update required.
Bungie’s approach is nowhere near that sophisticated, at least not yet. But the developer is trying something new, and it’s clear the studio has taken ample notes from watching Fortnite. The Almighty grew ever-closer in the sky over the past season, and players logging into the Tower on Saturday morning noticed that all the non-playable characters had shifted their positions to get a clear look at the ship’s descent. These changes were minor, and it will be interesting to see if Bungie can step up its game for future live events, if it does indeed try them again.
Regardless of the overall quality in this Almighty event, experiments like these represent, for the first time, Destiny 2 living up to the series’ original promise of a shared, living, and ever-changing world. They help the game better straddle the line between shooter and MMO, even if it’s taken quite a few years and nearly two full games to get there.
[Indie Live Expo] That Tiny Spaceship Trailer + Xbox One News + Demo News & More – Gamasutra
[This unedited press release is made available courtesy of Gamasutra and its partnership with notable game PR-related resource GamesPress.]
Game Name: That Tiny Spaceship
Developer: We Make Small Games
Release Date: 2020 (PC) / TBA Console
Platforms: Windows PC & Xbox One
Press Contact: [email protected] | 403-970-5653
Calgary, Alberta, Canada – June 5th, 2020 – We Make Small Games is proud to be included in the Indie Live Expo being held by Playism and Ryu’s Office, which begins tomorrow (June 6th) at 5 AM MST. Because most of you will probably be asleep at that time, please find enclosed all of our announcements and media ahead of time.
EMBARGO: June 6th 7 AM MST / 9 AM EST
In our press kit you will find: Copies of our game’s key, several screenshots, full press release in two languages, an HD version of our most recent trailer.
- Single player space shooter inspired by the coin-op games and early home console games of the 1980s.
- 4 missions – Beware: obstacles and hazards lurk off screen!
- Synth-wave inspired soundtrack
- Colourful cast of characters who have enrolled in the D.R.O.O.L institute’s program to become licensed single-occupancy drone pilots.
- Visual Novel sections in-between playable missions that explore our pilot’s academic struggles, interpersonal relationships and personalities.
Things Are Really Starting To Heat Up
With today’s trailer release we’ve revealed footage from our game’s second module, which is set above a blazing hot sun. D.R.O.O.L candidates will find that their craft can only take so much heat from the sun before the ship’s integrity starts to fail. Candidates will have to keep their ship from falling apart while dodging oncoming meteors, solar flares and, at the end of each stage, a large boss ship that can warp around!
Meet Our First Pilot
Players will be able to choose between 4 different playable pilots who appear during “modules” (missions) while piloting the titular Tiny Spaceship, as well as during Visual Novel sections that take place between modules.
We’d like to introduce you to our first revealed pilot, Urani!
Age: 18 (Earth Years)
Urani comes from a small, flooded planet on the outer edges of the galaxy. As such, they are an exceptionally strong swimmer. Ever curious, they joined D.R.O.O.L in order to have an excuse to explore the universe with impunity.
Setting Course For Xbox One
A version of That Tiny Spaceship will be released for the Xbox One home console after the Windows PC version has launched on Steam. That Tiny Spaceship on Xbox One will be published via the Xbox Creators Program and will be accessible in that section of the Store once published. More details surrounding the console version of That Tiny Spaceship will be revealed at a later date.
Regarding Release Platforms
When we originally announced That Tiny Spaceship back in May 2018, we named two target platforms in addition to Windows PC: Apple’s macOS and open-source Linux.
Many things have changed over the past two years, both within the game’s scope and in the realm of desktop gaming as a whole. In regard to macOS, the latest release has introduced changes to application development, and troubleshooting any potential issues would require development resources that we do not have at this time. In regard to Linux support, the sheer number of distributions available could make official support very difficult.
Several third-party options exist for players on macOS and Linux to attempt to play our game. Abstraction layers and recent changes to some digital distribution systems could potentially make That Tiny Spaceship playable on platforms that will not be officially supported.
Demo To Be Released In Summer 2020
A free playable demo containing the first Visual Novel section and the first introductory module will be made available on Steam later this summer. This will give players an introduction to That Tiny Spaceship and the game-play that we have to offer.
Minimum System Requirements (Revision)
Our Minimum System requirements have been slightly adjusted from the original specs announced when our Steam page went live. With the addition of the Visual Novel sections we’ll be putting quite a bit more HD art into the TTS than originally planned. To accommodate this the space requirements have been increased by 1GB.
OS: Windows 7 / 8.1 / 10
Processor: Dual Core 2.0 GHz
Memory: 2 GB RAM
Graphics: Intel HD 620 or Nvidia 950M / AMD HD 7970
DirectX: Version 11.0
Sound Card: On-board sound or stand-alone equivalent
“We Make Small Games” Logo, Branding and Zee Character © 2018 We Make Small Games “That Tiny Spaceship” Logo, Branding, Original Graphics and “That Tiny Spaceship” vector design © 2018 We Make Small Games.
©2018 Valve Corporation. Steam and the Steam logo are trademarks and/or registered trademarks of Valve Corporation in the U.S. and/or other countries. All rights reserved.
iOS 14 roundup: What we know before WWDC 2020 – 9to5Mac
With WWDC 2020 coming up on June 22nd, Apple will finally introduce the first beta version of iOS 14 to developers. This year, 9to5Mac had access to an early iOS 14 build that revealed some of the new features for Apple’s mobile operating system this year.
Read on for our full breakdown of iOS 14 leaks and rumors.
iOS 14 feature roundup:
Tweaked home screen
iOS 14 is expected to keep most of the design aspects from previous versions of iOS without a major redesign, but that doesn’t mean it won’t have any interface refinements.
Based on leaked code obtained by 9to5Mac, we found evidence that iOS 14 will include a new home screen page that allows users to see all of their application icons in a list view. The list view will include different sorting options to show only apps with unread notifications, recently opened apps and smart suggestions from Siri based on your daily usage.
This particular feature might be similar to the current Apple Watch app list view, but with advanced sorting options available.
Apple is also working on home screen widgets, 9to5Mac learned. Instead of pinned widgets like on iPadOS 13, the new widgets will be able to be moved around, just like any app icon. However, this feature seemed to be in a very early stage of implementation, and it’s possible Apple scraps it before public release.
iOS 14 will feature a redesigned wallpaper settings panel, which includes default wallpapers separated by collections, such as “Classic Stripes”, “Earth & Moon”, and “Flowers.” Instead of showing all wallpapers together, users will be able to scroll through each collection to find a specific wallpaper more easily.
Developers should be able to provide wallpaper collections and integrate them right into iOS Settings with a new Wallpaper API available to third-party apps. Users will also have the option to define a smart dynamic wallpaper that will only be used on the home screen. These dynamic wallpapers include a flat color, gradients, and a dark version based on the current wallpaper.
We discovered that it will be possible to set a custom wallpaper on CarPlay for the first time. Apple is testing this feature with the same default wallpapers from iOS 13, and they also automatically switch between light and dark versions.
With iOS 14, users will be able to receive alerts if the iPhone detects sounds like fire alarms, sirens, doorbells, and more. The system will translate these alerts into haptics for people who have hearing loss.
The camera will detect hand gestures to perform some specific tasks across the system, and code also points to a new “Audio Accommodations” accessibility feature which “can improve audio tuning over AirPods or EarPods for people with mild to moderate hearing loss.”
Immersive augmented reality
Apple is developing a new app internally referred to as “Gobi” that will allow users to get more information about what they’re seeing around them through augmented reality. iOS 14 code reveals that Apple is testing its new AR system with Apple Stores and Starbucks, so people would be able to use the iPhone or iPad camera to learn more about a product.
These new AR features are supposed to be released to developers through ARKit, so they can create their own interactions with real environments.
More HomeKit controls
HomeKit is about to get a big update with iOS 14. The system will feature “Night Shift to Light” which essentially includes the ability to change the light temperature of compatible lamps during the day automatically, much like Night Shift does on iPhone, iPad, and Mac displays.
Apple is also expected to expand its HomeKit Secure Video system, which will be able to identify specific people on camera such as family members, so you’ll receive custom notifications.
The CarKey API has been under development since iOS 13.4, but the feature is expected to be introduced with iOS 14. 9to5Mac found out that CarKey lets users unlock, lock, and start a car using an iPhone or Apple Watch.
The pairing process will be done through the Wallet app with NFC-compatible cars, as users only need to hold the device near the vehicle to use it as a key. iOS code hints that car keys can be shared with other people, such as family members. Drivers can also invite them through the Wallet app to have access to the key on their own Apple devices.
iOS 14 internal files also reveal that BMW may be the first car maker to support Apple’s CarKey later this year. We expect to learn more about Apple’s partners when CarKey is officially introduced.
Apple Maps enhancements
Apple Maps will show more details about Apple Stores and hardware repair availability in the future. With iOS 14, users will be able to check the availability of Genius Bar services at each Apple Store directly from Apple Maps. The app will tell users, for example, whether a specific store offers screen and battery repairs for the same day.
In addition to that, Apple Maps will highlight places that have seating for couples, discounts for children, private rooms, and movie theaters with IMAX sessions.
Advanced iCloud Keychain
For iCloud users who don’t want to subscribe to a paid password manager like 1Password, 9to5Mac found evidence that Apple is testing some major changes to iCloud Keychain on iOS 14. Users will be warned about reused passwords, so they can avoid using the same password on multiple sites for security reasons.
There will be a new method to save two-factor authentication passwords, so users will be able to log in on compatible sites using only the iCloud Keychain, without SMS, email, or other less secure methods.
Apple is working on a new way to offer specific parts of third-party apps across the system without needing to have them installed. Internally called Clips, this new API would allow developers to provide interactive and dynamic content from their apps even if users haven’t installed them.
The Clips API is directly related to the QR Code reader, so users will be able to scan a code linked to an app and then interact with it directly from a card that appears on the screen.
Developers will need to specify which part of the app should be downloaded by iOS as an Over-The-Air package to read that content. Apple is internally testing the Clips API with apps like OpenTable, Yelp, DoorDash, Sony, and YouTube.
Safari will get built-in translator features with iOS 14, allowing users to translate web pages without any third-party app or service. This feature should be automatically activated for web pages in different languages, and translations will be processed locally by the Neural Engine.
The translation option is also being tested with other apps, such as the App Store. In this case, iOS would translate app descriptions and reviews from users if these were written in another language.
More Apple Pencil tools
Apple added some new Apple Pencil features on iPadOS 13 last year, and 9to5Mac learned that iPadOS 14 might include full support for Apple Pencil input on websites, making it usable not only for scrolling and touch but also for drawing and markup, with all of its capabilities, in Safari and other browsers.
Keyboard brightness shortcut
Apple’s Magic Keyboard and Smart Keyboard for iPad lack function keys, which can be a downside, as users need to open the Settings app or Control Center to change screen brightness or keyboard backlighting.
We found evidence in code that suggests the existence of new keyboard shortcuts to change the brightness of the iPad screen or even the backlight of the keyboard. We believe the brightness function keys will be adjustable just like modifier keys, but with shortcut combinations.
Although this is already being tested internally with the iPadOS 13.5.5 beta, we believe these new shortcuts might be introduced with iPadOS 14.
‘Find My’ alerts
Apple revamped the Find My app last year with iOS 13, allowing users to track lost devices and share their location with family and friends. With iOS 14, Find My is expected to get another massive update, as Apple is planning custom alerts, an AR mode, and more.
The updated app will include a new option to receive an alert when someone doesn’t arrive at a specific location at a scheduled time of day. The new alert options will also include being notified when a contact leaves a location before a set time, which can be useful for monitoring children.
Apple’s Find My app in iOS 14 will also work with augmented reality, as users will be able to locate a friend or a lost device visually using augmented reality for more precise directions from close locations.
If an iPhone or iPad stops working, it’s often necessary to restore the device’s firmware using a Mac or PC. But Apple is now testing a new feature called “OS Recovery” that will let users restore an iOS device directly over-the-air as well as by connecting it via USB to another iPhone or iPad, similar to how Apple’s Migration Tool works.
It should work just like the macOS Internet Recovery, which has been available for years and allows users to reinstall the operating system over the internet without needing another computer nearby.
Ready for new hardware
More than just new features, iOS 14 should be ready to work with Apple’s newest products likely to be announced this year. Evidence of the second-generation iPhone SE and the new 2020 iPad Pro models were found by 9to5Mac in iOS 14 code long before these products were launched, and there is more to come.
iOS 14 code also includes new details about changes to the Apple TV. Prior versions of tvOS 13 code have revealed that Apple is working on a new Apple TV box, but iOS 14 also includes the tidbit that there might also be a new Siri TV Remote.
Apple has yet to introduce its AirTag item trackers, but iOS 14 code revealed that they will have a user-replaceable battery instead of an internal rechargeable one. Users will be able to attach an AirTag to any object to track it through Apple’s Find My app.
This early build of iOS 14 also revealed that Apple is testing new iPhone models with a time-of-flight sensor, which is likely the same LiDAR Scanner as iPad Pro. There are only two new iPhones in the code that are listed with three rear cameras plus the LiDAR Scanner, presumably the iPhone 12 Pro and the iPhone 12 Pro Max.
9to5Mac offered the first look at Apple’s over-ear headphones thanks to a glyph found in iOS 14. Based on the two versions of the same glyph, we expect Apple to offer at least two color options for the so-called “AirPods Studio”, likely black and white.
iOS 14 features: Device compatibility
A recent report suggests that iOS 14 will be compatible with all iPhone models that currently support iOS 13, ranging from iPhone 6s to iPhone 11 Pro Max. However, iPadOS 14 might drop support for iPad Air 2 and iPad mini 4, as these devices are powered by the A8 and A8X chips.
So these are the iOS devices that should be compatible with iOS 14:
- iPhone 6s and 6s Plus
- iPhone SE (1st generation)
- iPhone 7 and 7 Plus
- iPhone 8 and 8 Plus
- iPhone X
- iPhone XR
- iPhone XS and XS Max
- iPhone 11
- iPhone 11 Pro and 11 Pro Max
- iPhone SE (2nd generation)
- iPod touch (7th generation)
And the iPadOS 14 compatibility list:
- iPad (5th generation)
- iPad (6th generation)
- iPad (7th generation)
- iPad mini (5th generation)
- iPad Air (3rd generation)
- 12.9-inch iPad Pro
- 11-inch iPad Pro
- 10.5-inch iPad Pro
- 9.7-inch iPad Pro
iOS 14 wrap-up
Even without major design changes, iOS 14 is expected to bring several new features to enhance user experience. On the Home screen, iOS 14 features are likely to include widgets and a new list view.
Accessibility improvements are always welcome, and HomeKit enhancements will make the integration between devices even more seamless. The new AR system should prepare Apple’s operating system for its upcoming AR headset, and the CarKey API will certainly make life easier for those who own a car.
Keep in mind, however, that due to the COVID-19 pandemic, Apple’s plans may have changed and some features might be delayed or scrapped altogether. Check out 9to5Mac’s guide for more details on everything we know about iOS 14.