The Radical Evolution of Data Privacy in Push Notifications
Since the advent of the smart device, information about our identity, location, interests, beliefs, and day-to-day schedules has been more accessible than ever. While this has brought greater convenience for users, as well as new profitability models for businesses, it has also created new risks. In many ways, our understanding of the potential uses of our personally identifiable information, or PII, is evolving faster than technology and legal institutions can keep up. In the realm of mobile engagement in particular, data privacy is still akin to the Wild West.

It’s easy to forget just how much data we generate on any given day. We want the convenience of real-time, personalized notifications from our apps, but we don’t want companies to engage in unethical or invasive behavior with the information we give them. Where is the solution?
PII and the Risks of Data Privacy Breaches
Known as personal data, or personally identifiable information (PII), the digital information we provide goes far beyond what personal information used to entail: name, address, phone number, Social Security number, and so on. Now, PII can include:
- Biological information
- Likes and dislikes
- Political or religious ideals
- Virtually any information we enter into our devices, or even information our devices track without our noticing
Yes, this means we get shown ads that are tailored to our interests. We can drop a pin and get Deliveroo brought to us on the go. But there are also some hefty drawbacks. It can be difficult to stay on top of data privacy breach scandals perpetrated by big tech companies – Facebook and TikTok are repeat offenders. Facebook’s past behavior has been so unsavory that every new feature it offers – such as mobile dating or Preventative Health – seems to be a ploy to get its hands on even more of our PII. Data privacy is an especially fraught area when it comes to mobile marketing and push notifications.
We have barely scratched the surface of personal data as a digital resource. The sale, analysis, and processing of data has given rise to the attention economy. This is a model where algorithms drive us toward content and advertising that we are most likely to engage with. There are many reasons why this can be a positive influence. For example, a vegetarian would likely get value out of an ad for a book entitled “50 Delicious Ways to Cook Tofu.” Not so much an ad trying to sell them tickets to a hot dog festival. However, when it comes to unethical behavior – such as using voter demographics and preferences to influence elections – it becomes clear that we need to regulate who can access our data and what they can do with it.
GDPR, CCPA, HIPAA, and COPPA
2018 was the year government institutions began to rethink the way we approach data privacy. The European Union’s General Data Protection Regulation (GDPR) was adopted in 2016 and came into force on May 25, 2018. Since then, all websites and apps that handle the data of users in the EU must be transparent about how they collect and process it. Companies must gain consumers’ consent before collecting their data. They must also provide clear information on their websites about what any collected data is used for, and which third parties receive it. Article 8, known as GDPR-K, covers the treatment of personal data for users younger than 16. This includes the digital age of consent and the repercussions for violating underage users’ data privacy.
In the US, the protection of personal data is handled somewhat differently. The regulation closest in scope to GDPR is the California Consumer Privacy Act (CCPA), which expands the traditional, pre-digital definition of personal data to its current one: anything that can map out a person’s digital identity. It then guarantees California users various protections. For one thing, businesses must be transparent about their reasons for collecting data and allow users to opt out at any time. Companies must also delete any user data upon that user’s request.
The Children’s Online Privacy Protection Act (COPPA) of 1998 and the Health Insurance Portability and Accountability Act (HIPAA) are more familiar to most Americans thinking of data protection regulations. COPPA deals with the personal data of children younger than 13 – data collected via a website, email, a chat room, and so on. However, COPPA is over two decades old, and the way children access the internet has evolved considerably since then. Similarly, HIPAA has certain blind spots. It covers Protected Health Information (PHI), meaning any individually identifiable data about a patient’s health status, care, or payment for care. This includes medical data handled by health plans, healthcare providers, and healthcare clearinghouses.
How OpenBack Offers Data Privacy for Push Notifications
The crux of all this is that the way we interact with the digital world is changing. Our understanding of digital privacy is also changing, but there are different regulations in different parts of the world. How does this impact push notifications?
OpenBack uses a hybrid process for sending notifications, in which the standard cloud server structure is replaced by edge computing. The developer has the option to select a process in which data is leveraged on the device itself, cutting third-party servers out of the equation. Using this option, all notifications sent locally through the device’s OS framework comply with GDPR, HIPAA, COPPA, and CCPA by default. Rather than a cloud server, OpenBack itself acts as the data processor in this model. Under this model of mobile engagement, app users retain full ownership of their personal data, and they can email OpenBack to request deletion of that data at any time.
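To make the edge-computing idea concrete, here is a minimal sketch of what on-device notification decisioning can look like. This is not OpenBack’s actual API – the class and function names (`DeviceSignals`, `NotificationRule`, `should_deliver`) are hypothetical – but it illustrates the principle: the delivery rules ship with the message, the decision runs against signals read locally, and none of the user’s data ever leaves the device.

```python
# Hypothetical sketch of on-device notification decisioning.
# The signals are read locally and never uploaded; only the
# yes/no delivery decision is made here, on the device itself.
from dataclasses import dataclass, field


@dataclass
class DeviceSignals:
    """Signals read locally from the device; never sent to a server."""
    battery_level: float   # 0.0 - 1.0
    is_unlocked: bool
    local_hour: int        # 0 - 23


@dataclass
class NotificationRule:
    """Delivery conditions bundled with the message payload."""
    message: str
    min_battery: float = 0.2
    quiet_hours: range = field(default_factory=lambda: range(22, 24))


def should_deliver(rule: NotificationRule, signals: DeviceSignals) -> bool:
    """Decide locally whether to show the notification now."""
    if signals.battery_level < rule.min_battery:
        return False                      # don't drain a low battery
    if signals.local_hour in rule.quiet_hours:
        return False                      # respect quiet hours
    return signals.is_unlocked            # deliver while the user is active


rule = NotificationRule(message="Your order has shipped!")
print(should_deliver(rule, DeviceSignals(0.8, True, 14)))  # delivered
print(should_deliver(rule, DeviceSignals(0.8, True, 23)))  # suppressed: quiet hours
```

Because every input to `should_deliver` originates on the device, there is no PII in transit for a third-party server to log or leak – which is what makes this style of delivery compliant with the regulations above by default.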
To learn more about OpenBack’s hybrid model, which uses edge computing to send highly personalized, real-time push notifications while remaining 100% regulation compliant, get in touch with the OpenBack team.