Posted by Jeevan Jayaprakash

Image source: Indiatimes

In this issue, we look at rumours suggesting that AR is set to be a key feature of this year's iPhone, Google's machine learning show of strength and, finally, the news that Foursquare is set to share its gold mine of location intelligence.

Have a great week,

Hi Mum! Said Dad.


UBS analyst suggests that augmented reality could take a central role in the next iPhone

We have previously brought you some interesting comments regarding AR made by Apple CEO Tim Cook. If your memory is a little hazy, here is a quick refresher on some of the things he has said about AR over the course of the last year: “I regard it as a big idea like the smartphone. I think AR is that big, it’s huge”, “[AR] is a core technology, not a product per se” and, last but not least, “Augmented Reality will take some time to get right, but I do think that it’s profound”.

That last comment in particular is something of a nod to Apple’s tendency to ship features and technology only once they are confident that their implementation is the industry gold standard. This has given Apple the reputation of being a relatively late adopter of technologies (or maybe Google and Android device manufacturers just move ridiculously fast to satisfy their demanding user base of early adopters). For example, Android devices have beaten the iPhone to NFC, fingerprint readers, mobile payments, wireless charging, curved displays and, more recently, built-in AR capabilities (see Google Tango).

However, according to Steven Milunovich from UBS, it seems that Apple feels confident enough to go to market with its take on AR later this year with the new iPhone. Milunovich states that Apple apparently has over 1,000 engineers working on AR technology in Israel. Rumours suggest that Apple’s proposition will not be too different to Google’s Tango and will feature “moderate 3D mapping and possibly an AR software development kit” as a starting point. The former could herald an era of ‘3D selfies’ that can be rendered into games and apps, whilst the latter, in typical Apple fashion, will allow for tightly controlled, third-party AR content creation that works in harmony with its hardware.

Apple’s intervention could lead to a profound change in the way we talk about AR.


Google announces machine learning API for recognising objects in video

Google has augmented its portfolio of cloud machine learning models with the addition of an API that will allow developers to parse and search videos. The Cloud Video Intelligence API will allow developers to search for nouns such as “human” and “dog” as well as verbs such as “swim” or “fly” in videos. The API can also identify when entities appear and return every shot in which a given entity features. Prior to this, the contents of videos had to be tagged manually by humans, so this is quite a big deal.
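For a flavour of what that looks like in practice, here is a minimal sketch using Google's Python client library for the API. The bucket URI is a hypothetical example, and the exact shape of the client surface may differ between library versions.

```python
# Minimal sketch: label detection with the Cloud Video Intelligence API.
# Requires the google-cloud-videointelligence package and GCP credentials.
from google.cloud import videointelligence

client = videointelligence.VideoIntelligenceServiceClient()

# Kick off an asynchronous annotation job for a video in Cloud Storage.
# The URI below is hypothetical.
operation = client.annotate_video(
    request={
        "input_uri": "gs://my-bucket/dog-at-the-beach.mp4",
        "features": [videointelligence.Feature.LABEL_DETECTION],
    }
)
result = operation.result(timeout=300)

# Each detected label (e.g. "dog", "swimming") comes back with the shots
# in which it appears, so you can jump straight to the relevant footage.
for label in result.annotation_results[0].shot_label_annotations:
    print(label.entity.description)
    for segment in label.segments:
        start = segment.segment.start_time_offset.total_seconds()
        end = segment.segment.end_time_offset.total_seconds()
        print(f"  shot {start:.1f}s to {end:.1f}s (confidence {segment.confidence:.2f})")
```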

According to Google, the API is largely targeted at large media organisations and consumer technology companies looking to build media catalogues or better manage crowd-sourced content. Fei-Fei Li, chief scientist of artificial intelligence and machine learning at Google Cloud, said “We’re beginning to shine light on the dark matter of the digital universe” during her demonstration of the technology on stage at the Google Cloud Next conference.

With the arrival of the Cloud Video Intelligence API, Google has added to a rather impressive toolbox of ‘ready to use’ machine learning models including Vision, Natural Language and Translation APIs.

Google are very much positioning themselves to go head to head with Microsoft and Amazon’s cloud businesses.

Image source: TechCrunch

Foursquare to share its ‘superpower’ with the world

Foursquare has spent years collecting location data from millions of people across the world — they have benefitted hugely from the trust that consumers have in the service they provide. This trust has meant that they have been able to glean some incredible insights, such as when Americans start craving Filipino food and how store closures impact Americans’ shopping habits.

The proprietary technology that underpins this ability to understand where phones, and therefore people, go in the real world in real time is now being opened up to enterprises, marketers and software developers via a software development kit. The Pilgrim SDK allows its users to send messages based on people’s real-world activity, such as when they enter a store or when they are about to undertake a particular activity.

So, why is this a game changer? Well, there is no additional hardware (yes, beacons, we are looking at you) or software required. The SDK also integrates with popular solutions such as Localytics and Urban Airship, meaning rolling it out for live products should be painless. It also leverages Foursquare’s powerful Places database, which features 93 million ‘Places Shapes’: essentially time-sensitive geofences based on signals like Bluetooth and Wi-Fi (think of it as a blob that intelligently adjusts based on historical footfall data). It is worth mentioning that the Places database is used by the likes of Snapchat and Twitter.
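To make the idea concrete, here is a hypothetical sketch of what acting on a Pilgrim-style visit event might look like server-side. The webhook route and payload fields below are entirely our own illustration, not Foursquare's actual schema, and the push call is a stand-in for an integration such as Urban Airship.

```python
# Hypothetical sketch: a small server reacting to Pilgrim-style visit events.
# The route and payload fields are illustrative, not Foursquare's real schema.
from flask import Flask, request, jsonify

app = Flask(__name__)

def send_welcome_message(user_id: str, venue_name: str) -> None:
    # Stand-in for a push-notification call via e.g. Urban Airship.
    print(f"Push to {user_id}: Welcome to {venue_name}!")

@app.route("/pilgrim/visit", methods=["POST"])
def handle_visit():
    # Example (invented) payload:
    # {"user_id": "abc123", "event_type": "arrival", "confidence": "high",
    #  "venue": {"name": "Coffee Shop", "category": "café"}}
    event = request.get_json(force=True)
    if event.get("event_type") == "arrival" and event.get("confidence") == "high":
        send_welcome_message(event["user_id"], event["venue"]["name"])
    return jsonify(status="ok")

if __name__ == "__main__":
    app.run(port=8080)
```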

We are certainly looking forward to having a play with this!

Image source: Foursquare Türkiye

Originally written as part of Hi Mum! Said Dad’s Weekly Digest.

If you would like to receive our Weekly Digest straight to your inbox, please drop us an email at hello@himumsaiddad.com.