Apple created the attention sinkhole. Here are some ways to fix it.

Your attention span is the battleground, and the tech platforms have you bested. Social media platforms like Facebook, Twitter, and Instagram get the bulk of the blame for employing sketchy tactics to drive engagement. And they deserve most of the criticism; as Tristan Harris points out, users are at a serious disadvantage when competing against companies trying to lure them in with virtually endless resources.

However, one company responsible for this crisis escapes relatively unscathed. Apple jumpstarted the smartphone revolution with the iPhone. Our phones are no longer just an extension of our brains but, for many, a replacement. Somewhere along the way, things went south. Your phone is less a digital hub and more a sinkhole for your mind.

I believe that, for having built a device that demands so much of our attention, Apple has left its users in the dark when it comes to using it for their own good. It has built a portal through which companies can suck up as much of our time as they demand, without giving us the ability to protect ourselves. Sure, there have been some attempts to solve the problem, with features like Do Not Disturb and Bedtime, but most of them have been half-assed at best. The market has tried to fill the void, but OS restrictions render most efforts futile.

Currently, iOS, the world’s most advanced mobile operating system as the company calls it, is built to serve apps and app developers. Apple should focus on having its OS serve its users first and the apps second.

1 · Attention

I have touched on this before in the context of the Apple Watch, but I believe Apple has built a device so visually compelling, and so connected to apps that literally employ PhDs to get you addicted, that users are treated like lab mice pressing pedals to get the next hit. This is unsustainable, and also irresponsible.

I believe Apple should give users enough data, in both raw and visually appealing formats, to help them make informed choices. Moreover, the OS should allow people to limit their (or their kids’) use of their phones. And lastly, Apple should use technology to help users, if only to offset the thousands of people trying to get them addicted.

1.1 · Allow Users to See Where Their Time Went

First of all, Apple needs to give users a way to see how much time they spend on their phones, per app. There are clumsy ways to get this data. The popular Moment app does this by literally inspecting screenshots of the battery usage screen. The lengths developer Kevin Holesh went to in order to make this app useful are remarkable, and the application itself is definitely worth it, but it shouldn’t be this hard. And it is not enough.

A user should be able to go to a section in either the Settings app or maybe the Health app and see the number of hours (of course it is hours) they have spent on their phone, per day, per app. If this data includes average session time, defined by either the app being in the foreground or, in the case of the iPhone X, being looked at, even better. The sophisticated face tracking on the new iPhone can already tell whether you are paying attention to your phone; why not use that data for good?

FaceID Demonstration
Paying serious attention

In an ideal case, Apple would make this data available through a rich, queryable API. This is obviously tricky given the privacy implications; ironically, this kind of data would be a goldmine for anyone looking to optimize their engagement tactics. However, even a categorized dataset, with app names discarded, would be immensely useful. This way, users can see whether they really should be spending hours a day in a social media app. At the very least, Apple could share this data, in aggregate, with public health and research institutions.
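To make the idea concrete, here is a minimal sketch of what a queryable usage log might look like. None of these types exist in iOS today; `UsageRecord` and `ScreenTimeStore` are names I made up to show how little surface area a per-app, per-day usage report would actually need.

```swift
import Foundation

// Hypothetical sketch only: none of these types exist in iOS today.
// A privacy-aware usage log the OS could expose to its owner.
struct UsageRecord {
    let appCategory: String      // e.g. "Social", "Productivity"; app names can be discarded entirely
    let sessionStart: Date
    let sessionDuration: TimeInterval
    let wasLookedAt: Bool        // on iPhone X, derived from attention detection
}

struct ScreenTimeStore {
    private var records: [UsageRecord] = []

    mutating func log(_ record: UsageRecord) {
        records.append(record)
    }

    /// Total hours spent per category on a given day.
    func hoursByCategory(on day: Date, calendar: Calendar = .current) -> [String: Double] {
        var totals: [String: Double] = [:]
        for record in records where calendar.isDate(record.sessionStart, inSameDayAs: day) {
            totals[record.appCategory, default: 0] += record.sessionDuration / 3600
        }
        return totals
    }

    /// Average session length, the number users rarely see but should.
    func averageSessionMinutes(for category: String) -> Double {
        let sessions = records.filter { $0.appCategory == category }
        guard !sessions.isEmpty else { return 0 }
        let total = sessions.reduce(0) { $0 + $1.sessionDuration }
        return total / Double(sessions.count) / 60
    }
}
```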

1.2 · Allow Time-Based and Screen-Time Limits for Apps

Second, Apple should allow users to limit time spent in an app, possibly as part of parental settings, or Restrictions, as Apple calls them. There is already precedent for this: Apple offers granular settings to disable everything from downloading apps altogether to changing privacy settings such as location access.

Users should be able to set duration limits per app (e.g. 1 hr/day, 10 hrs/week), time-window limits (e.g. only between 5 PM and 8 PM), or both. Either of these would be socially accepted, if not welcomed. Bill Gates himself limits his kids’ time with technology, as did Steve Jobs, and so does Jony Ive. Such features should be built into the OS.
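For what it’s worth, expressing such limits is trivial. The sketch below is hypothetical (`AppLimit` and `AppRestriction` are my own names, not anything Apple ships), but it shows that both budget-style and window-style limits boil down to a few lines of configuration, with the thresholds picked purely for illustration.

```swift
import Foundation

// Hypothetical sketch: how a per-app limit might be expressed.
enum AppLimit {
    case dailyBudget(hours: Double)                    // e.g. 1 hour per day
    case weeklyBudget(hours: Double)                   // e.g. 10 hours per week
    case allowedWindow(startHour: Int, endHour: Int)   // e.g. only 17:00 to 20:00
}

struct AppRestriction {
    let bundleIdentifier: String
    let limits: [AppLimit]

    /// Whether a new session may start, given usage so far today and this week.
    func allowsLaunch(at date: Date, usedTodayHours: Double, usedThisWeekHours: Double) -> Bool {
        let hour = Calendar.current.component(.hour, from: date)
        for limit in limits {
            switch limit {
            case .dailyBudget(let hours) where usedTodayHours >= hours: return false
            case .weeklyBudget(let hours) where usedThisWeekHours >= hours: return false
            case .allowedWindow(let start, let end) where hour < start || hour >= end: return false
            default: break
            }
        }
        return true
    }
}
```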

Steve Jobs and Bill Gates on stage
Low tech parents

As an aside, I think there are lots of visual ways to encourage healthier app habits. App icons could slowly darken as a limit approaches, or show a small progress indicator (like when they are being installed). This way, someone can tell at a glance that they have Instagrammed enough for the day.

1.3 · Make Useful Recommendations

With the new Apple Watch and watchOS 4, Apple is working with Stanford to detect arrhythmia by comparing current heart rate data to the user’s known baseline. Since its inception, the Watch has used rings to encourage people to stand up and move around. Even my Garmin watch keeps track of when I have been standing still for too long.

Apple can do this for attention, too. The next time you find yourself stressed, notice how you switch between apps over and over again. Watch how people sometimes close an app, swipe around, and come back to the same app just to send that one last text. These are observable patterns of stress.

Apple can, both proactively and reactively, watch for these patterns and recommend that someone take a breather, maybe literally. With the Watch, Apple went out of its way to build a custom vibration that simulates stretching on your wrist for breathing exercises. The attention to detail, and the license to be playful, are there. Using nothing but on-device learning, Apple can tell when you are stressed, nervous, just swiping back and forth, and recommend a way to relax. Moreover, the OS can see whether users’ sessions across apps are too short or too long, and make suggestions based on that kind of data.
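A crude version of such a heuristic is genuinely simple. The sketch below is purely illustrative; the thresholds (eight switches in two minutes, sessions under ten seconds) are numbers I invented, and real on-device learning would be far more nuanced, but it shows that even a naive rule can flag restless back-and-forth switching.

```swift
import Foundation

// Hypothetical on-device heuristic: detect anxious, rapid app switching.
// The thresholds here are invented for illustration only.
struct AppSwitch {
    let bundleIdentifier: String
    let timestamp: Date
}

/// Returns true if recent switches look like restless back-and-forth:
/// many switches in a short window, most sessions only a few seconds long.
func looksLikeRestlessSwitching(_ switches: [AppSwitch],
                                window: TimeInterval = 120,
                                minSwitches: Int = 8,
                                shortSession: TimeInterval = 10) -> Bool {
    guard let last = switches.last else { return false }
    let recent = switches.filter { last.timestamp.timeIntervalSince($0.timestamp) <= window }
    guard recent.count >= minSwitches else { return false }

    // Count how many sessions ended within a few seconds of starting.
    var shortSessions = 0
    for (current, next) in zip(recent, recent.dropFirst())
    where next.timestamp.timeIntervalSince(current.timestamp) < shortSession {
        shortSessions += 1
    }
    return Double(shortSessions) >= Double(recent.count) * 0.7
}
// If this fires, the OS could surface a gentle Breathe-style suggestion.
```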

Display on a Mercedes Car showing Attention Assist
Attention Assist, Indeed

As mentioned, there’s a lot of precedent for using technology to determine mental state and make recommendations. Any recent Mercedes will assess your fatigue based on how you drive and recommend you take a coffee break. Many of GM’s new cars have driver-facing cameras that can tell whether your eyes are open and paying attention during self-driving mode. Using your phone is not as risky as driving a car, but for many, a phone is a much bigger part of their life.

2 · Notifications

Notifications on iOS are broken. With every iOS release, Apple tries to redo the notification settings in a valiant effort to help people handle the deluge of pings. There are many notification settings hidden inside the Settings app, with cryptic names like banners and alerts, and many more besides.

Apple Notification Guidelines
If only

However, currently all notifications from all apps live on a single plane. An annoying re-engagement campaign from a fledgling app gets the same treatment as your mom trying to say hi. Moreover, apps abuse notification channels; the permissions last forever, but the users’ interest does not. And of course, the data is sorely missing.

2.1 · Allow Users to See Data About Notifications and Their Engagement

Again, this is a simple one. Apple should make notification data, both raw and as easily digestible reports, available to users. It is easy for this to get out of hand, but I think even a single listing in which apps are ranked by notifications per day or week would be useful. Users should be able to tell that the shopping app they used once has been sending them notifications they have been ignoring.

2.2 · Categorize and Group Notifications

Apple should allow smarter grouping of notifications, similar to email. Currently, as mentioned, notifications largely flow through a single channel. This doesn’t scale. Tristan Harris and his group make a good suggestion: separate notifications by their origin. To start, anything directly caused by a user action should be separated from everything else. This would mean that your friend sending you a message is a different type of notification than Twitter telling you to nudge them.

I think there are even bigger opportunities here; without getting too deep into it, Apple could help developers tie notifications to specific people and start categorizing them by intent. Literally anything beyond what is currently available would be an improvement.
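As a rough illustration, here is the kind of metadata the OS could require developers to attach to a push payload. The names (`NotificationOrigin`, `CategorizedNotification`) are hypothetical, and the ranking is deliberately simplistic: people first, campaigns last.

```swift
// Hypothetical sketch: metadata a developer could be required to attach
// to a push payload, so the OS can group and rank notifications.
enum NotificationOrigin {
    case directResponse                       // caused by something the user did: a reply, a delivery update
    case personMessage(senderName: String)    // another human reaching out
    case systemAlert                          // security, billing, account issues
    case promotional                          // re-engagement campaigns and the like
}

struct CategorizedNotification {
    let appBundleIdentifier: String
    let origin: NotificationOrigin
    let body: String
}

/// A trivially simple ranking: people and direct responses first, campaigns last.
func priority(of notification: CategorizedNotification) -> Int {
    switch notification.origin {
    case .personMessage:   return 0
    case .directResponse:  return 1
    case .systemAlert:     return 2
    case .promotional:     return 3
    }
}
```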

This idea would definitely receive a ton of pushback, especially from companies whose business relies on getting users addicted to their products. However, maintaining toxic business models shouldn’t be a priority. If a user does not want to launch Facebook, then they shouldn’t have to. If an app can drive engagement, or whatever one might call mindless scrolling, only with an annoying push notification, maybe it shouldn’t be able to.

This is the kind of storm Apple can weather. While Apple cherishes its relationships with app developers, it is beholden primarily to its users. And such a change would almost certainly be welcomed by them.

2.3 · Allow Short Term Permissions for Notifications

For many types of apps, notifications are only useful for a limited amount of time. When you call an Uber or order food, you do want notifications, but the rest of the time an email or a low-key notification would suffice. Users should be able to give apps temporary permission to nudge them, and then the window should automatically close.

This is something many people are already familiar with. Professionals such as doctors, college professors, and lawyers have office hours when you can talk to them freely; outside those hours, you cannot.
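The mechanics of such a grant are almost embarrassingly simple. Below is a hypothetical sketch; nothing like `TemporaryNotificationGrant` exists in Apple’s UserNotifications framework today, but it shows how small the actual bookkeeping would be.

```swift
import Foundation

// Hypothetical sketch: a notification permission that expires on its own.
// The names are invented for illustration.
struct TemporaryNotificationGrant {
    let appBundleIdentifier: String
    let grantedAt: Date
    let duration: TimeInterval    // e.g. 2 hours while waiting for a ride or a delivery

    var expiresAt: Date { grantedAt.addingTimeInterval(duration) }

    func isValid(at date: Date = Date()) -> Bool {
        date < expiresAt
    }
}

// Usage: when an order is placed, the app asks for a two-hour window.
let grant = TemporaryNotificationGrant(appBundleIdentifier: "com.example.fooddelivery",
                                       grantedAt: Date(),
                                       duration: 2 * 60 * 60)
// Once `grant.isValid()` returns false, pushes fall back to a quiet,
// non-interrupting delivery (or are dropped entirely).
```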

2.4 · Make Useful Recommendations

Once again, Apple could take an even more proactive role and help users manage their notifications by making recommendations. For example, the OS could keep track of which notifications one engages with meaningfully and which one ignores. This way, the phone could ask the user whether they would like to silence an app they never use.

Apple already does this to some degree with app developers; if your app’s notifications are too spammy and users rarely engage, you’ll get a call. However, users should have a say too. An app that is meaningful to one user might be spammy to another. The OS could make these decisions, or at least make smart recommendations. A feature like this literally exists to help you save storage space on your phone; why not for your notifications too?

Ending Thoughts

I believe that an attention-based economy, where millions of people are in a constant state of distraction broken only by tiny bursts of concentration, is dangerous to our mental health as individuals and to society as a whole. Wasting hours switching between apps and accomplishing nothing is one thing, but a constant need to be entertained, an inability to be alone with one’s thoughts, and an inability to simply be around people without pulling out a phone are all going to cause broad social issues we’ll grapple with for years. When the people who built these tools are scared, it’s a good sign that we have lost control of our creations.

Surprisingly, iOS lags far behind Android in this respect. I have used an iPhone almost exclusively since its launch, and I wrote the bulk of this piece without doing much research. I was surprised, and somewhat embarrassed, to discover that most of what I proposed in the Attention section, such as bedtimes and app limits, already exists in Android as part of Family Link. And of course, tools like RescueTime have long existed on Mac and Windows to help people see where their time went, but their functionality is next to useless on iOS. As mentioned, even the Moment app can only do so much within the confines of Apple’s ecosystem.

I wholeheartedly believe that unless we approach this issue the way we did smoking, and elevate the discussion to a matter of public health, it won’t get solved. However, there are ways to help curb the problem, and it is time Apple took the matter into its own hands.

Unlike most other tech companies, Apple makes most of its money by selling hardware to consumers. Every couple of years, you buy an iPhone, and maybe an app or two, and Apple gets a cool thousand bucks. Apple’s incentives, although less so recently with its growing services revenue, are aligned with those of its users, not advertisers or marketers. If Apple is serious about its health focus, now is the right time to act.

With Big Data Comes Big Responsibility

It’s getting harder to suppress the sense of impending doom. With the latest Equifax hack, the question of data stewardship has been propelled into the mainstream, again. There are valid calls to reprimand those responsible, and even to shut down the company altogether. After all, if a company whose business is safekeeping information can’t keep that information safe, what other option is there?

The increased attention to the topic is welcome, but the outrage misses a key point. The Equifax hack is unfortunate, but it is not a black swan. It is merely the latest incarnation of a problem that will only get worse unless we all do something.

The main issue is this: any mass collection of personally identifiable data is a liability. Individuals whose data is vacuumed up en masse, the companies who do the vacuuming, and legislators should all become aware of the risks. It is fashionable to say “data is the new oil,” but the analogy only goes so far, especially when you consider the current situation of the oil-rich countries. Silicon Valley itself is especially vulnerable here.

Big parts of the tech industry in the Bay Area are built on the mass collection of such private data and on deriving some value from it. A significant part of that value comes, somewhat depressingly, from ever more precise ad targeting. The problem with this model has long been known, if not tacitly admitted by its creators, but it wasn’t until the Snowden revelations that a real national debate picked up. With the recent brouhaha following the 2016 elections, and a real risk of an authoritarian government in the US, the questions are louder this time.

Public outcry does help, but change is very slow. Part of the reason is that the business models are wildly successful. Combined, Alphabet (née Google) and Facebook are a trillion-dollar duopoly. With a cottage industry built around these two companies, and with practically every stakeholder in the area either beholden or financially tied to the industry, the motivation to change is small. Some companies, like Apple, try to raise the issue to a higher moral plane, partly for ethical reasons and partly for competitive ones. But the data keeps getting collected at an ever-increasing pace, and it’s getting more and more likely that a catastrophic event will occur.

Let’s first talk about how data gets exposed. Hacking, or unauthorized access, is the most talked about way, but it’s far from the only one. A lot of the time, it’s just a matter of a small mistake. Take Dropbox. The cloud storage company once allowed anyone to log into anyone else’s account by entirely ignoring the password check. The issue was caught quickly, but it’s a dire reminder of how easily small mistakes happen. And that is a point worth pondering, separate from the recent hack Dropbox suffered.

As easily as data is collected and stored, it’s even easier for it to change hands. Companies and their assets change hands, and so do the jurisdictions they live under. The Russian tech sector is a prime example. Pavel Durov, the founder of the oddly popular instant messaging platform Telegram, first built VKontakte, a Russian social network far more popular than Facebook in that country. But then came the Russian government with demands for censorship. Durov fled, and the social network is now owned by a figure much closer to the government. And there’s always LiveJournal, which was also sold to a Russian company, putting all of its data under Russian jurisdiction.

And sometimes, companies open themselves up to being hacked. Once an internet darling, Yahoo! was put in the spotlight when its own security team found a poorly designed hacking tool installed by none other than the company itself. Initially intended to track certain child-pornography-related emails for the government, the tool was built without the knowledge of the company’s Chief Security Officer, Alex Stamos, a well-regarded security professional. He departed the company soon after, only to join Facebook. And this is on top of the Yahoo! hack that affected a billion users and almost derailed a multi-billion-dollar acquisition.

Government surveillance is a touchy subject, and moral decisions are always fuzzy, leaving someone unhappy. Governments should use the tools at their disposal to keep their citizens safe, and this might sometimes require uncomfortable measures. That doesn’t mean they should be given direct access to millions of people’s private data, however. Intelligence efforts should be targeted, not dragnets. Living in a liberal democracy requires a certain amount of discomfort, not pure order.

But it is hard to deny the evidence at hand: from once-liberal darlings like Turkey to known autocratic regimes like China, any government will find it impossible to resist the temptation to take a peek at the data, one way or another.

Governments are made up of people, just like corporations. The solutions to these problems won’t be easy; with so much already built, tearing it all down is not an option, nor would it be preferable. The industries that have been built add value and employ thousands, if not millions. But we have to start somewhere, as individuals, as technology companies, and as legislators.

First, individuals need to be more cognizant of the decisions they make about their data. Some of this will require education, starting from a much younger age. But even today, there are plenty of easy steps one can take.

For many uses, more private, less surveillance-oriented tools already exist. Instant messaging apps like WhatsApp (bought by Facebook for a whopping $19 billion) are easy to use while employing cutting-edge end-to-end encryption technology borrowed from Signal. One might wonder whether essentially playing spy is worth the hassle, but the risks are real, and getting more so every day, even for members of Congress in the US.

For regular browsing, things are in worse shape. Practically every site on the internet tracks you across every other site; shopping and news sites are particularly bad. Users are fighting back with sometimes clunky, equally overzealous tools. Thanks to the aggressive adoption of ads, both intrusive and sometimes malicious, ad blocking is on the rise around the world. It is hard to fault consumers; most would benefit from using an independently owned ad blocker like uBlock Origin, or a browser like Brave that has such technology built in. Apple recently updated its Safari browser on both macOS and iOS to “intelligently” curb cross-site tracking.

For email and cloud storage, matters are trickier. For many users, their data is safer with a big company that has a competent security team than with a smaller service provider. There’s a balance here; while big providers are much juicier targets (including for governments, which can request data legally), they also have the benefit of being hardened by such attacks. Companies like Google use their own services, further incentivizing them to safeguard data, at least from hackers.

However, even then, most people would benefit from tightening security beyond the defaults. For users of Gmail, Dropbox, and virtually any other cloud service, using two-factor authentication, coupled with a password manager, is a must.

And, going back to cognizance, individuals must be aware of the data they provide and be at least minimally informed. When you sign up for a new service, before handing over all your data, see whether it at least offers a way to delete or export it. Even if you never use either option, they are good signs that the company treats your data properly instead of letting it seep into its machinery.

For creators of such technology, things are harder, but there’s hope. The first step is obvious: companies should treat personally identifiable data as a liability and collect as little as possible, only for a specific purpose. This is also the general philosophy behind the EU’s new General Data Protection Regulation (GDPR). Instead of collecting as much data as possible and hoping to find a good use for it later, companies should collect data only when they need it. And most importantly, they should delete the data when they are done with it, instead of hoarding it.

Moreover, companies should invest in technologies that do not require collecting data at all, such as client-side computation instead of server-side. Apple is the prime example here; the company runs machine learning models that are generated on the server from aggregate data, for things like image recognition or speech synthesis, on the devices themselves. In a bit of poetic justice, the intelligent cross-site tracking prevention Apple built into its browser is itself based on data collected in aggregate rather than in personally identifiable form.

It is not clear whether such technologies can keep up with server-based solutions, where iteration is much faster, but the investments might pay dividends. Today’s smartphones easily compete with the servers of just a few years ago in performance. Things will only get better.

And for times when mass collection of data is required, companies should invest in techniques that collect data in aggregate rather than in personally identifiable form. There are huge benefits to collecting data from big populations, and the patterns that emerge from such data can benefit everyone. Again, Apple is a good example here, though Uber is also worth mentioning. Both companies aggressively use a technique called differential privacy, where private data is essentially scrambled enough that it is no longer identifiable, yet the patterns remain. This way, Uber analysts can view traffic patterns in a city, or even do precise analysis for a given time, without knowing any individual’s trips.
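To make the idea concrete, the textbook version of local differential privacy is randomized response: each device flips its true yes/no answer with some probability before reporting it, so any single report is deniable, yet the aggregate rate can still be recovered. The sketch below is a toy illustration of that mechanism, not Apple’s or Uber’s actual pipeline.

```swift
import Foundation

// Toy sketch of randomized response, the textbook local differential privacy
// mechanism. Not Apple's or Uber's actual implementation.
//
// Each device reports a possibly-flipped version of a true yes/no answer
// (e.g. "did you visit this neighborhood today?"). Any single report is
// deniable, but the aggregate rate can still be recovered.
func randomizedResponse(truth: Bool, flipProbability: Double = 0.25) -> Bool {
    Double.random(in: 0..<1) < flipProbability ? !truth : truth
}

/// Recover an estimate of the true "yes" rate from noisy reports.
/// With flip probability p: observed = true * (1 - p) + (1 - true) * p.
func estimateTrueRate(observedYesRate: Double, flipProbability p: Double = 0.25) -> Double {
    (observedYesRate - p) / (1 - 2 * p)
}

// Quick check with simulated devices:
let trueRate = 0.30
let reports = (0..<100_000).map { _ in randomizedResponse(truth: Double.random(in: 0..<1) < trueRate) }
let observed = Double(reports.filter { $0 }.count) / Double(reports.count)
print(estimateTrueRate(observedYesRate: observed))   // prints roughly 0.30
```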

And more generally, companies should invest in and actively work on technologies that reduce reliance on individuals’ private data. As mentioned, a big ad industry will not go away overnight, but it can be transformed into something more responsible. Technologists are known for their innovative spirit, not their defeatism.

End-to-end encryption is another promising technology. While popular for instant messaging, it is still in its infancy for things like cloud storage and email. There are challenges: the technology is notoriously hard to use, and recovery is problematic when someone forgets their encryption key, such as their password. Maybe most importantly, encryption makes the data entirely opaque to storage companies, severely limiting the value they can provide on top of it.

However, there are solutions, some already invented, some still being worked on. WhatsApp showed that end-to-end encryption can be deployed at massive scale and made easy to use. Other companies, like Keybase, are working on more user-friendly ways to do group chat, and possibly storage, along with a new paradigm for identity. And there are more futuristic technologies like homomorphic encryption. Still in the research phase, if it works as expected it might allow cloud storage services where the core data stays private while remaining searchable and indexable. Technology companies should direct more of their research and development resources to such areas, not just to better ways of collecting and analyzing data.

And lastly, legislators need to wake up to the issue before it is too late. The US government should enshrine individual privacy as a right instead of treating it as a commercial matter. Moreover, the mass collection of personally identifiable data needs to be brought under supervision.

The current model, where an executive responsible for leaking 140 million US consumers’ data can get away with a slap on the wrist and a $90M payday, does not work. Stronger punishment would help, but preventing such leaks at the source, by limiting the size, fidelity, or longevity of the data, would be better.

Moreover, legislators should work with the industry to better educate consumers about the risks. Companies will be unwilling to share details about what is possible with the data they hold on their users (and unsuspecting visitors), but in the long run it is better for consumers to be able to make informed decisions. Target made headlines when it reportedly figured out a woman was pregnant before she could tell her parents. Customers should be aware of such borderline creepy technology before they become subject to it, all the more so considering Target itself was also the victim of multiple major hacks. Facebook was recently the subject of a similar report, in which the company discovered a family member of a tech reporter (the same reporter who broke the Target story), and no one is quite clear how. Individuals should not feel this powerless against corporations.

The current wave of negative press against Silicon Valley, caused mostly by the haphazard way social networks were used to amplify messages from subversive actors, is emotionally charged but not wholly undeserved. Legislators can and should help technology companies earn back people’s trust by enabling informed debate about their capabilities. A bigger public backlash, when it happens, would make today’s pessimism seem like a nice day in the park.

There are huge benefits to massive amounts of data. There is virtually no industry that wouldn’t benefit from having more of it. Cities can make better traffic plans, medical researchers can study diseases and health trends, and governments can make better policy decisions. It can be commercially beneficial too: with more data we can build better machine learning tools, from cars that drive themselves to medical devices that identify a disease early on. Even data collected for boring purposes can become useful; Google’s main revenue source is selling ads on top of its search results, which no user would want to give up.

Data might be the new oil, but only with mindful, responsible management will the future look like Norway rather than Venezuela or Iraq. In essence, huge troves of personally identifiable data are a big liability. And the benefit we currently derive from such data largely amounts to things like better ad targeting. No one wants to go back to a time without Google or Facebook. But it is possible to be more responsible with the data. The onus is on everyone.

iPhone stole your attention. Your watch might help.

Apple announcements never fail to entertain. Over the years, the most amusing moments have come when an Apple executive claims that their products don’t just contain amazing technology but embody larger-than-life qualities. A couple of years ago, when Apple removed the headphone jack from its phones, it called the move, without a hint of irony, “courage”. This year’s announcements had their share of squirm-inducing moments too, from Apple Town Squares to soul-sucking visualizations of face-scanning technology. But for me, the real kicker was when Apple decided to associate the cellular-connected Apple Watch with “freedom”.

It’s hard not to cringe when Apple’s first promo video for the cellular Watch shows a surfer receiving a call right in the middle of her sick trick. How is that a good thing? Do people not go on vacation to unplug? The eye rolls didn’t stop there: Apple chose to demo making a phone call with nothing but a watch by having an executive answer a call during a paddle-boarding session on Lake Tahoe. I wrote off the proclamations of freedom via a $400 watch, combined with a $120-a-year bill hike, as garden-variety Apple navel-gazing.

It wasn’t until I read a review of the watch by Hodinkee, the high-end watch blog, that I realized the freedom Apple was promising was nothing more than freedom from its own device, the phone. It’s a great read overall, with lots of interesting insights into the watch industry itself. But what caught my eye was how the watch changed, or rather reduced, the reviewer’s phone use.

In the few days I’ve been using the Series 3 Edition as my only communication device, I’ve found myself checking Instagram less. Texting less. Dickin’ around on the web less. I use the watch to text or make phone calls when I need to – and that’s it. My definition of “need” has changed completely – and frankly I don’t miss having my phone in my pocket at all.

The smartphone promised us always-on connectivity, and we welcomed it with open arms. The ability to respond to an email immediately wasn’t new, but add an actual web browser and an App Store that extended the phone’s functionality virtually endlessly, and we got hooked. As the fidelity of the medium increased, the phone slowly became not just a device used for a specific purpose but something we use, more or less, for the sake of using it. In short, we traded our attention for the promise of always-on connectivity.

The reasons our phones are so addictive are numerous, and we are just discovering the consequences, both personal and societal, of such an enormous shift in how we manage our attention spans. Although the research is still taking shape, there are already a few loud voices telling us that the commodification of our attention is nothing less than a full-scale war waged by the brightest minds of our generation against our identity.

I am no Luddite; I have earned my living for the past seven years working at technology companies. As I have moved across first cities, and then countries, I have relied on technology to stay connected to those dear to me. I also think technology is an essential tool for slowly bringing down humanity’s arbitrary barriers, democratizing access to information, and generally making the world a more just place.

The Apple Watch stands here as an interesting device, promising connectivity with a much smaller drag on one’s attention. It has a screen, but a much smaller one than your phone’s; you simply can’t look at it for hours on end. Its input methods are similar to a phone’s (with the notable exception of a camera), but voice plays a much bigger role on it, ironically, than it does on the phone. You can realistically use your watch via voice, for both input and output, and rely on the screen only for the occasional glance.

Of course, the same dangers that made the smartphone an attention hog loom over the watch. Unlike a phone, a watch is always attached to your body, able to jolt you at any time with its vibration motor. And Apple is not being subtle about its goals; while it is admirable that the company is using the heart-rate sensor to detect heart conditions and to provide data to researchers around the world, there’s something off-putting about your heart rate being measured constantly and uploaded, even in aggregate form, to some datacenter somewhere. And maybe this will all be moot once the tech industry actually puts its resources, as it hasn’t so far, behind developing watch apps that become as addictive as their phone counterparts.

It is too early in our technological evolution to tell what the prevailing way of interacting with technology will be, and for what purposes. Smartphones seem ubiquitous now, but it’s important to remember that they have existed for merely ten years, a blink of an eye even at technology’s fast-changing pace. It is unlikely, and would be depressing, for a 6-inch glass slab littered with apps whose raison d’être is to collect more data about you to sell better ads to be the conclusion of human-computer interaction.

In a way, Apple’s proclamation of the freedom you can get with a watch is an admission of this guilt. What the watch promises is freedom from your phone. More than any other company, Apple created this world in which we feel a compulsive desire to be entertained and never bored. And maybe, with the watch, Apple can help undo some of the damage. This is not to suggest that Apple sells devices mainly to advance human civilization rather than to make unfathomable amounts of money to spend on absurd buildings, nor that we should ask a giant corporation for salvation from our sins.

Unlike many of the other tech giants, Apple makes most of its money (though increasingly not all of it) from directly selling products to its customers. Without other intermediaries to take a cut, the company’s incentives are more directly aligned with those of its users. And more than that, with its size and reach, Apple is a company that sets the tone for the industry.

Our mode of interaction with our technology is still evolving. It is not reasonable to roll back to a world where always-on connectivity isn’t the norm. But that doesn’t mean that our attention should be up for sale. A device, or a combination of devices, that makes a conscious effort to be less in your face and more out of your way is one way to ensure that.