An Australian Competition and Consumer Commission (ACCC) inquiry into digital platforms (as previously covered at Croakey) is an opportunity to tackle significant public health concerns, according to a review of some key submissions, as reported below.
Melissa Sweet writes:
Leading health organisations have added their weight to growing international momentum for a regulatory crackdown on internet behemoths like Facebook, amid concerns ranging from the stealthy marketing of unhealthy products to the spread of disinformation and skewing of policy and election outcomes.
The current regulatory framework is failing consumers and exposing them to a range of harms, while also undermining policy and political processes, according to submissions to an Australian Competition and Consumer Commission (ACCC) inquiry into digital platforms and its preliminary report.
The submissions call for greater regulation on multiple fronts, especially around the online targeting of minors and use of digital platforms to market unhealthy commodities such as alcohol, unhealthy food and gambling.
The Public Health Association of Australia (PHAA) submission calls for regulations to protect children from unhealthy marketing, to be applied to all media formats, including digital platforms.
Regulators should ensure all minors are automatically opted out of targeted online advertising and prohibit the use of children’s personal data for tracking, targeted advertising and other marketing strategies, with financial sanctions imposed for breaches of such regulations, the submission says.
The PHAA says it is particularly concerned that the lack of transparency and regulation of these platforms is increasingly allowing policy and political campaigning to become part of the operating model of corporations and industries that sell unhealthy goods or services.
“These campaigns include attempts to influence policy and regulatory decisions by parliaments and governments, as well as exercises to damage the reputations of political parties, individual politicians and even unelected officials who take public interest stances against such interests,” the association’s submission says.
The PHAA says “the impacts of unfettered intervention in political discourse by corporate interests through modern social media platforms is pervasive” and that powerful corporations can cause various “mischiefs” through digital platforms, such as creating false or ‘fake’ opinions within the community about basic facts and matters of science and medicine.
“They can mislead the public about the virtues (or vices) of various policy options, or about their practicality or effectiveness. They can engage in malicious damage to the reputations of, and indeed sabotage the careers of, politicians and also non-elected officials who make a stand against corporate interests.
“High-profile impacts on public discourses such as Brexit and the most recent US presidential election (and indeed many other elections) are only the tip of the iceberg. The same practices are operating constantly on a wide array of corporate-interest areas of public policy. Many relate to health policy and ultimately to the public’s actual health.”
The PHAA argues that the ACCC has jurisdiction over such concerns “because digital platforms now form part of the communications ‘infrastructure’ on which modern society operates”, and this infrastructure is used both for commercial purposes and by consumers for policy and political discourse.
“While much of this community ‘political’ usage is non-commercial in nature, the act of selling the use of platforms to the public as a means of political communication is itself a commercial activity, and thus creates a market, which is properly a domain of interest to the ACCC,” the PHAA says.
Meanwhile, concerns about the impact of Facebook on democratic processes are also highlighted in a recent, must-read report from the UK, by the House of Commons Digital, Culture, Media and Sport Committee: Disinformation and ‘fake news’: Final Report.
Amongst other things, the UK report calls for electoral law reform to enable “absolute transparency of online political campaigning, including clear, persistent banners on all paid-for political adverts and videos, indicating the source and the advertiser; a category introduced for digital spending on campaigns; and explicit rules surrounding designated campaigners’ role and responsibilities”.
The report says: “Electoral law is not fit for purpose and needs to be changed to reflect changes in campaigning techniques, and the move from physical leaflets and billboards to online, microtargeted political campaigning.”
It calls for social media companies to be pressured to publicise any instances of disinformation, including sharing information about foreign interference on their sites – including who has paid for political adverts, who has seen the adverts, and who has clicked on the adverts – with the threat of financial liability if such information is not forthcoming.
Wider regulatory push
The PHAA's calls for all minors to be automatically opted out of targeted advertising, and for a ban on the use of children's personal data for tracking, targeted advertising and other marketing strategies, are also backed in another submission, made by a consortium of health and related groups.
Signatories – including the Foundation for Alcohol Research and Education (FARE), the Alliance for Gambling Reform, the Australian Council on Children and the Media, the Australian Health Policy Collaboration, Cancer Council Australia, the Centre for Alcohol Policy Research, The George Institute Australia, Obesity Australia, the University of Sydney WHO Collaborating Centre for Physical Activity, the Public Health Advocacy Institute WA, and the Uniting Church in Australia’s Synod of Victoria and Tasmania – call for “immediate and decisive” regulatory action to protect children from the “unconscionable” marketing of unhealthy commodities.
The joint submission says:
It is the responsibility of our government to ensure that children are able to participate in the digital world without being targeted by marketers of alcohol, unhealthy food, gambling and other products potentially harmful to their health.
It is clear that marketers are currently using children’s data to target them with advertisements. This raises serious ethical questions regarding the monetisation and use of minors’ data – effectively selling the use of children’s data to marketing agencies.”
In a separate submission, FARE also highlights the risks to children from alcohol marketing online and urges that “consideration should be given to banning alcohol advertising entirely from some media formats, such as online”.
FARE highlights disparities in the regulation of alcohol marketing: alcohol advertising on traditional media platforms is regulated through a co-regulatory system that generally prohibits alcohol advertising during children's viewing hours on free-to-air television, while the rules differ for subscription television and there is no regulatory framework restricting marketing on digital platforms or online.
The Foundation urges the ACCC to broaden its recommendations for a review of the media regulatory framework to specifically include the regulation of advertising and marketing, especially the marketing of harmful products like alcohol.
FARE is concerned about a lack of legislative protections to stop alcohol companies directly targeting vulnerable people through digital platforms, or digital platforms deliberately or inadvertently targeting vulnerable people.
An example of inadvertent targeting may be when someone who buys a lot of alcohol (because they are a dependent drinker) is served up more and more alcohol ads because the algorithms have rightly identified the person as interested in that content, the submission says.
The FARE submission also highlights ethical concerns surrounding the collection of minors’ personal data:
The current practice of digital platforms selling the use of children’s personal information, which is then stored indefinitely, is morally repugnant.
It also opens up scary possibilities for unethical marketers who may be looking for certain traits in data that indicate whether young people will be particularly receptive or vulnerable to alcohol advertising when they reach the legal age of consumption.”
FARE says any regulatory authority tasked with monitoring and reporting on digital platforms should report not only on advertisement ranking but also on targeting and expenditure.
Advertisers and digital platforms should be compelled to report data on alcohol advertising including the parameters of their advertising requests, spend, targeting, effectiveness and audience, and this data should be submitted to the independent regulatory authority for analysis, monitoring and compliance.
“While we understand the rationale behind not publicly releasing detailed data provided by digital platforms, at a minimum, a regulatory authority should publish annual reports on high-level industry activity that make recommendations on dealing with both discriminatory and unconscionable conduct,” the Foundation said.
FARE also recommends that the ACCC’s suggestion for a campaign to improve online news literacy in the general community include broader media literacy issues, such as digital marketing:
Digital marketing content has developed in sophistication, now designed to be entertaining, immersive and engaging, thereby increasing the difficulties of distinguishing an advertisement by an adult consumer, let alone a child.
An increasing use of native content, interactive games, influencers and other novel marketing techniques further blur the lines between content and advertising, and have proven to be highly attractive to young people.
Media literacy is a key skill that young people need to develop in order to be able to critically engage with media and the increasing barrage of marketing messages. Research on smoking prevention programs has suggested that higher rates of media literacy are associated with reduced rates of smoking among adolescents.”
The UK report also stressed the importance of improving the community’s digital literacy, which it said “should be a fourth pillar of education, alongside reading, writing and maths”.
The UK committee recommended that a social media company levy be used to finance a comprehensive educational framework—developed by charities, NGOs, and the regulators themselves—to inform people of the implications of sharing their data willingly, their rights over their data, and ways in which they can constructively engage and interact with social media sites.
Despite strong, consistent messages from health groups about the need for consumer-focused regulation of digital platforms, the UK report suggests such advocates face a David and Goliath task, stating that Facebook seems “unwilling to be properly scrutinised” and “has continually hidden behind obfuscation”.
If the ACCC’s final report, due in June, follows through on recommendations in its preliminary report, Facebook et al are unlikely to cop it quietly on the chin.
The House of Commons committee describes a pattern of Facebook refusing to cooperate with inquiries and regulators, noting:
We invited Mark Zuckerberg, CEO of Facebook—the social media company that has over 2.25 billion users and made $40 billion in revenue in 2017—to give evidence to us and to this Committee; he chose to refuse, three times.
By choosing not to appear before the Committee and by choosing not to respond personally to any of our invitations, Mark Zuckerberg has shown contempt towards both the UK Parliament and the ‘International Grand Committee’, involving members from nine legislatures from around the world.”
The report cites Ashkan Soltani, an independent researcher and consultant, and former Chief Technologist to the US Federal Trade Commission, who called into question Facebook’s willingness to be regulated.
When discussing Facebook’s internal culture, Soltani said, “There is a contemptuousness—that ability to feel like the company knows more than all of you and all the policy makers”. He discussed the California Consumer Privacy Act, which Facebook supported in public, but lobbied against, behind the scenes.
Furthermore, the report warns that the scale of Facebook’s sharing of users’ personal data is set to increase massively, “given the news that, by early 2020, Facebook is planning to integrate the technical infrastructure of Messenger, Instagram and WhatsApp, which, between them, have more than 2.6 billion users”.
And, finally, it is worth quoting the report at length as we approach the NSW and Federal elections in Australia:
We have always experienced propaganda and politically-aligned bias, which purports to be news, but this activity has taken on new forms and has been hugely magnified by information technology and the ubiquity of social media.
In this environment, people are able to accept and give credence to information that reinforces their views, no matter how distorted or inaccurate, while dismissing content with which they do not agree as ‘fake news’. This has a polarising effect and reduces the common ground on which reasoned debate, based on objective facts, can take place.
Much has been said about the coarsening of public debate, but when these factors are brought to bear directly in election campaigns then the very fabric of our democracy is threatened. This situation is unlikely to change.
What does need to change is the enforcement of greater transparency in the digital sphere, to ensure that we know the source of what we are reading, who has paid for it and why the information has been sent to us. We need to understand how the big tech companies work and what happens to our data.
Facebook operates by monitoring both users and non-users, tracking their activity and retaining personal data. Facebook makes its money by selling access to users’ data through its advertising tools. It further increases its value by entering into comprehensive reciprocal data-sharing arrangements with major app developers who run their businesses through the Facebook platform.
Meanwhile, among the countless innocuous postings of celebrations and holiday snaps, some malicious forces use Facebook to threaten and harass others, to publish revenge porn, to disseminate hate speech and propaganda of all kinds, and to influence elections and democratic processes—much of which Facebook, and other social media companies, are either unable or unwilling to prevent.
We need to apply widely-accepted democratic principles to ensure their application in the digital age. The big tech companies must not be allowed to expand exponentially, without constraint or proper regulatory oversight. But only governments and the law are powerful enough to contain them. The legislative tools already exist. They must now be applied to digital activity, using tools such as privacy laws, data protection legislation, antitrust and competition law.
If companies become monopolies they can be broken up, in whatever sector. Facebook’s handling of personal data, and its use for political campaigns, are prime and legitimate areas for inspection by regulators, and it should not be able to evade all editorial responsibility for the content shared by its users across its platforms.
In a democracy, we need to experience a plurality of voices and, critically, to have the skills, experience and knowledge to gauge the veracity of those voices. While the Internet has brought many freedoms across the world and an unprecedented ability to communicate, it also carries the insidious ability to distort, to mislead and to produce hatred and instability.
It functions on a scale and at a speed that is unprecedented in human history.
One of the witnesses at our inquiry, Tristan Harris, from the US-based Center for Humane Technology, describes the current use of technology as “hijacking our minds and society”. We must use technology, instead, to free our minds and use regulation to restore democratic accountability.”