A study published by the Sydney Medical School of Public Health on the 29th of last month has issued a rather grave warning to both Google and Apple. To be accurate, Google in fact received the majority of the bashing. In essence, the study casts light on the extent to which the tobacco industry is utilising the new 'smartphone app medium' as a tool to market indiscriminately to minors, and on how the two biggest players in that arena provide less than adequate precautionary safeguards.
The study claims that Google and Apple have not only failed to employ a proper framework to safeguard their users but are also in violation of the World Health Organisation's Framework Convention on Tobacco Control (WHO FCTC). See the section in question below.
(e) "undertake a comprehensive ban or, in the case of a Party that is not in a position to undertake a comprehensive ban due to its constitution or constitutional principles, restrict tobacco advertising, promotion and sponsorship on radio, television, print media and, as appropriate, other media, such as the internet, within a period of five years."
In context, Article 13 of the WHO FCTC details the obligations of compliant parties with regard to “tobacco advertising, promotion and sponsorship”. The section quoted above clearly states that parties are in breach of their treaty obligations should they fail to restrict the advertising of tobacco products via the listed media, which includes online sources such as the internet.
The U.S. signed the WHO FCTC as a participating party on 10 May 2004, meaning Google and Apple are obligated to comply. According to the study, the number of 'pro-smoking apps'...
“Any app that explicitly provided information about brands of tobacco, where to buy tobacco products, images of tobacco brands or cigarettes, and apps that might encourage smoking behaviour by providing smoking trigger cues, for example, smoking simulation apps that show a cigarette on the screen and ask the user to light it and smoke it.”
...identified in Apple's store (65) was larger than the number found in Google's. However, the level of regulation employed by Apple was found to be considerably more stringent. To elaborate, where Apple employs a pop-up system designed to discourage potential users from unsuitable content, Google's classification consists of a rather loose three-level rating system: Low Maturity, Medium Maturity and High Maturity. This raises the question: what policies is Google actually employing to regulate content unsuitable for minors in its app store?
Consider that, of the 1,000 results returned for smoking-related keywords, 42 English-language apps meeting the aforementioned criteria were identified in Google Play, and these had been downloaded by a minimum of 6,225,786 users as of February 2012. This data is accurate insofar as such figures are accessible to researchers. In addition, smoking apps are available under various categories that potentially expose pro-smoking content to minors, Games and Entertainment being two such categories. We can only speculate on the extent to which minors are in fact downloading pro-smoking content, as demographic data is also withheld. However, unsuitable content without safeguards ought to be enough to merit concern.
Not only do Google and Apple have a moral obligation to safeguard minors against content deemed unsuitable, but arguably a legal obligation as well. You would think that if the moral obligation weren't incentive enough, complying with the WHO framework would be. Sadly, Google has so far failed to implement a system of regulation even close to that of its counterpart.
The concern over the availability of pro-smoking content is undoubtedly justified, but it perhaps raises a more general question.
To what extent should platforms employ regulatory frameworks in the knowledge of hosting unsuitable content?
Let us know your take in the comments below!