<?xml version="1.0" encoding="UTF-8"?>
<?xml-stylesheet type="text/xsl" href="part2stratml.xsl"?>
<PerformancePlanOrReport xmlns="urn:ISO:std:iso:17469:tech:xsd:PerformancePlanOrReport" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
 xsi:schemaLocation="urn:ISO:std:iso:17469:tech:xsd:PerformancePlanOrReport http://stratml.us/references/PerformancePlanOrReport20160216.xsd" Type="Strategic_Plan"><Name>Confronting Health Misinformation: The U.S. Surgeon General’s Advisory on Building a Healthy Information Environment</Name><Description>Because it pollutes our information environment, misinformation is harmful to individual and public
health. Together, we have the power to build a healthier information environment. Just as we have all
benefited from efforts to improve air and water quality, we can all benefit from taking steps to improve the
quality of health information we consume. Limiting the prevalence and impact of misinformation will help
all of us make more informed decisions about our health and the health of our loved ones and communities...
Addressing health misinformation will require a whole-of-society effort. We can start by focusing on the following areas of action: </Description><OtherInformation>A Surgeon General’s Advisory is a public statement that calls the American people’s attention to a public health issue and provides recommendations for how that issue should be addressed.  Advisories are reserved for significant public health challenges that need the American people’s immediate awareness. For additional background, visit SurgeonGeneral.gov.</OtherInformation><StrategicPlanCore><Organization><Name>U.S. Public Health Service</Name><Acronym>USPHS</Acronym><Identifier>_73eea5b8-39b3-11ec-933a-bca51983ea00</Identifier><Description/><Stakeholder StakeholderTypeType="Person"><Name>Vivek H. Murthy, M.D., M.B.A.</Name><Description>Vice Admiral, U.S. Public Health Service | Surgeon General of the United States ~
During the COVID-19 pandemic, there have been significant efforts to address health misinformation.  
Here are just a few examples:</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Trusted Community Members</Name><Description>Trusted community members, such as health professionals, faith leaders, and
educators, have spoken directly to their communities to address COVID-19-related questions (e.g., in town halls, community meetings, via social and traditional media)</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Health Professionals</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Faith Leaders</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Researchers</Name><Description>Researchers have identified leading sources of COVID-19 misinformation, including misinformation “super-spreaders”</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Media Organizations</Name><Description>Media organizations have devoted more resources to identify and debunk
misinformation about COVID-19</Description><Stakeholder StakeholderTypeType="Generic_Group"><Name>Technology Platforms</Name><Description>Some technology platforms have improved efforts to monitor and address
misinformation by reducing the distribution of false or misleading posts and
directing users to health information from credible sources</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Governments</Name><Description>Governments have increased their efforts to disseminate clear public health
information in partnership with trusted messengers</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Each of Us</Name><Description>But there is much more to be done, and each of us has a role to play. Before posting or sharing an item on social media, for example, we can take a moment to verify whether the information is accurate and whether the original source is trustworthy. If we're not sure, we can choose not to share. When talking to friends and family who have misperceptions, we can ask questions to understand their concerns, listen with empathy, and offer guidance on finding sources of accurate information.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Society</Name><Description>It will take more than individual efforts, however, to address health misinformation. The threat of misinformation raises important questions we must answer together: How do we curb the spread of harmful misinformation while safeguarding user privacy and free expression? What kinds of measures should technology platforms, media entities, and other groups adopt to address misinformation? What role is appropriate for the government to play? How can local communities ensure that information being exchanged—online and offline—is reliable and trustworthy? How can we help family and friends who may have been exposed to harmful misinformation?</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Individuals</Name><Description>What Individuals, Families, and Communities Can Do ...
* Learn how to identify and avoid sharing
health misinformation. ~ When many of us share
misinformation, we don’t do it intentionally: We
are trying to inform others and don’t realize the
information is false. Social media feeds, blogs,
forums, and group chats allow people to follow
a range of people, news outlets, and official
sources. But not every post on social media can
be considered reliable. And misinformation can
flourish in group texts or email threads among
friends and family. Verify accuracy of information
by checking with trustworthy and credible
sources. If you’re not sure, don’t share. 

* Engage with your friends and family on the
problem of health misinformation. ~ If someone
you care about has a misperception, you might be
able to make inroads with them by first seeking
to understand instead of passing judgment. Try
new ways of engaging: Listen with empathy,
establish common ground, ask questions,
provide alternative explanations and sources of
information, stay calm, and don’t expect success
from one conversation.

* Address health misinformation in your
community. ~ Work with schools, community
groups such as churches and parent-teacher
associations, and trusted leaders such as educators
and health care professionals to develop local
strategies against misinformation. For example,
invite local health professionals to schools or to
faith congregations to talk about COVID-19
vaccine facts.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Families</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Communities</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Educational Institutions</Name><Description>What Educators and Educational Institutions Can Do ... 
* Strengthen and scale the use of evidence-based
educational programs that build resilience
to misinformation. ~ Media, science, digital,
data, and health literacy programs should be
implemented across all educational settings,
including elementary, secondary, post-secondary,
and community settings. In addition to teaching
people how to be more discerning about the
credibility of news and other content, educators
should cover a broader set of topics, such as
information overload, internet infrastructure
(e.g., IP addresses, metadata), the challenges of
content moderation, the impact of algorithms
on digital outputs, algorithmic bias, artificial
intelligence (AI)-generated misinformation (e.g.,
deepfakes), visual verification skills, and how
to talk to friends and family who are sharing
misinformation. 

* Educate students and the public on
common tactics used by those who spread
misinformation online. ~ Recent research
suggests that teaching people how to spot
these tactics can reduce people's willingness
to share misinformation. Examples of
misinformation tactics used by those who deny
scientific consensus on health issues include
presenting unqualified people as experts;
misleading consumers with logical fallacies;
setting impossible expectations for scientific
research; cherry-picking data or anecdotes; and
introducing conspiracy theories.

* Establish quality metrics to assess progress in
information literacy. ~ While there is substantial
media and information literacy work being
carried out across the United States, there is
a need for more consistent and empirically
evaluated educational materials and practices.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Educators</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Health Organizations</Name><Description>What Health Professionals and Health Organizations Can Do ... 
* Proactively engage with patients and the
public on health misinformation. ~ Doctors,
nurses, and other clinicians are highly trusted
and can be effective in addressing health
misinformation. If you are a clinician, take the
time to understand each patient’s knowledge,
beliefs, and values. Listen with empathy,
and when possible, correct misinformation
in personalized ways. When addressing
health concerns, consider using less technical
language that is accessible to all patients. Find
opportunities to promote patient health literacy
on a regular basis.

* Use technology and media platforms to share
accurate health information with the public. ~ 
For example, professional associations can equip
their members to serve as subject matter experts
for journalists and effectively communicate peer-reviewed research and expert opinions online. 

* Partner with community groups and other
local organizations to prevent and address
health misinformation. ~ For example, hospital
systems can work with community members
to develop localized public health messages.
Associations and other health organizations
should offer trainings for clinicians on how to
address misinformation in ways that account for
patients’ diverse needs, concerns, backgrounds,
and experiences.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Health Professionals</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Media Organizations</Name><Description>What Journalists and Media Organizations Can Do ...
* Train journalists, editors, and others to
recognize, correct, and avoid amplifying
misinformation. ~ Media organizations should
develop in-house training programs and partner
with journalism schools, nonprofits, technology
platforms, and others to democratize access to
high-quality training for all media outlets.

* Proactively address the public’s questions. ~ 
When something is new—such as a vaccine—
people will understandably have questions. By
anticipating and proactively answering those
questions, media organizations and journalists can
help get ahead of misinformation and increase the
public’s health and information literacy.

* Provide the public with context to avoid
skewing their perceptions about ongoing
debates on health topics. ~ For example, when
discussing conflicting views on an issue, give
readers a sense of where the scientific community
stands and how strong the available evidence is
for different views. Consider questions like: How
much disagreement is there among experts? Is a
given explanation plausible even if it is unlikely?
If evidence is not equally strong on all sides of an
issue, avoid presenting it as such.

* Carefully review information in preprints. ~ 
Preprints are research papers published online
before peer review. They can provide scientists
and the public with useful information, especially
in rapidly evolving situations such as a pandemic.
However, because preprints have not been
independently reviewed, reporters should be
careful about describing findings from preprints
as conclusive. If reporting on such findings,
include strong caveats where appropriate, seek
out expert opinions, and provide readers with
context.

* Use a broader range of credible sources—
particularly local sources. ~ Research shows
us that people have varying levels of trust in
different types of people and institutions. In
addition to relying on federal and state public
health authorities as sources, build relationships
with local health professionals and local trusted,
credible health organizations.

* Consider headlines and images that inform
rather than shock or provoke. ~ Headlines are
often what audiences will see and remember.
If a headline is designed to fact-check a rumor,
where possible, lead with the truth instead of
simply repeating details of the rumor. Images
are often shared on social media alongside
headlines and can be easily manipulated and used
out of context. Picture desk and social media
editors should consider how provocative and
medically inaccurate imagery can be a vehicle for
misinformation.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Journalists</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Technology Platforms</Name><Description>What Technology Platforms Can Do ... 
* Assess the benefits and harms of products
and platforms and take responsibility for
addressing the harms. ~ In particular, make
meaningful long-term investments to address
misinformation, including product changes.
Redesign recommendation algorithms to avoid
amplifying misinformation, build in “frictions”—
such as suggestions and warnings—to reduce the
sharing of misinformation, and make it easier for
users to report misinformation.

* Give researchers access to useful data to
properly analyze the spread and impact of
misinformation. ~ Researchers need data on
what people see and hear, not just what they
engage with, and what content is moderated
(e.g., labeled, removed, downranked), including
data on automated accounts that spread
misinformation. To protect user privacy, data can
be anonymized and provided with user consent.

* Strengthen the monitoring of misinformation. ~ 
Platforms should increase staffing of multilingual
content moderation teams and improve the
effectiveness of machine learning algorithms
in languages other than English since non-English-language misinformation continues to proliferate. Platforms should also address
misinformation in live streams, which are more
difficult to moderate due to their temporary
nature and use of audio and video.

* Prioritize early detection of misinformation
"super-spreaders" and repeat offenders. ~ Impose
clear consequences for accounts that repeatedly
violate platform policies.

* Evaluate the effectiveness of internal policies
and practices in addressing misinformation
and be transparent with findings. ~ Publish
standardized measures of how often users are
exposed to misinformation and through what
channels, what kinds of misinformation are most
prevalent, and what share of misinformation is
addressed in a timely manner. Communicate why
certain content is flagged, removed, downranked,
or left alone. Work to understand potential
unintended consequences of content moderation,
such as migration of users to less-moderated
platforms.

* Proactively address information deficits. ~ An
information deficit occurs when there is high
public interest in a topic but limited quality
information available. Provide information
from trusted and credible sources to prevent
misconceptions from taking hold.

* Amplify communications from trusted
messengers and subject matter experts. ~ 
For example, work with health and medical
professionals to reach target audiences. Direct
users to a broader range of credible sources,
including community organizations. It can be
particularly helpful to connect people to local
trusted leaders who provide accurate information.
Prioritize protecting health professionals,
journalists, and others from online
harassment, including harassment resulting from
people believing in misinformation.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Research Institutions</Name><Description>What Researchers and Research Institutions Can Do ... 
* Strengthen the monitoring of health questions,
concerns, and misinformation. ~ Focus on a
broader range of content and platforms, as
well as on information flow across platforms.
For example, examine image- and video-based
content and content in multiple languages. To
address existing research limitations, expand data
collection methods (e.g., recruit social media
users to voluntarily share data).

* Assess the impact of health misinformation. ~ 
There is an urgent need to comprehensively
quantify the harms of health misinformation.
For example, how and under what conditions
does misinformation affect beliefs, behaviors,
and health outcomes? What is the role of
emotion, cognition, and identity in causing
misinformation to “stick”? What is the cost to
society if misinformation is left unchecked?

* Prioritize understanding how people are
exposed to and affected by misinformation,
and how this may vary for different
subpopulations. ~ Tailor interventions to the
needs of specific populations. Invite community
members to participate in research design.

* Evaluate the effectiveness of strategies
and policies to prevent and address health
misinformation. ~ For example, can flagging
certain content as misinformation have
unintended consequences? Is it possible to build
resilience to misinformation through inoculation
methods such as “prebunking”? (Debunking
involves correcting misinformation once
someone has been exposed to it. Prebunking,
or preemptively debunking, involves warning
people about misinformation they might come
across so they will be less likely to believe it when
exposed.)</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Researchers</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Foundations</Name><Description>What Funders and Foundations Can Do ... 
* Move with urgency toward coordinated, at-scale investment to tackle misinformation. ~ Assess funding portfolios to ensure meaningful,
multi-year commitments to promising research
and programs.

* Invest in quantifying the harms of misinformation and identifying evidence-based interventions. ~ Focus on areas facing private and public funding gaps. Examples could include independent and local journalism,
accountability mechanisms for platforms, and
community-based health literacy programs.
Provide training and resources for grantees
working in communities disproportionately
affected by misinformation (e.g., areas with
lower vaccine confidence).

* Incentivize coordination across grantees to
maximize reach, avoid duplication, and bring
together a diversity of expertise. ~ For example,
encourage coordination around monitoring
health misinformation across multiple languages.</Description></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Funders</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Governments</Name><Description>What Governments Can Do ...
* Convene federal, state, local, territorial, tribal,
private, nonprofit, and research partners ~ to
explore the impact of health misinformation,
identify best practices to prevent and address
it, issue recommendations, and find common
ground on difficult questions, including
appropriate legal and regulatory measures that
address health misinformation while protecting
user privacy and freedom of expression.

* Increase investment in research on
misinformation. ~ For example, more research
is needed to better define misinformation,
document and process its harms, and identify
best practices for preventing and addressing
misinformation across mediums and diverse
communities.

* Continue to modernize public health
communications. ~ Work to understand
Americans’ health questions, concerns, and
perceptions, especially for hard-to-reach
populations. Deploy new messaging and
community engagement strategies, including
partnerships with trusted messengers. Proactively
and rapidly release accurate, easy-to-understand
health information in online and in-person
settings. Invest in fact-checking and rumor
control mechanisms where appropriate.

* Increase resources and technical assistance to
state and local public health agencies to help
them better address questions, concerns, and
misinformation. ~ For example, support the creation
of teams within public health agencies that can
identify local misinformation patterns and train
public health misinformation and infodemic
researchers. Work with local and state health
leaders and associations to address ongoing needs.

* Expand efforts to build long-term resilience
to misinformation. ~ For example, promote
educational programs that help people
distinguish evidence-based information from
opinion and personal stories.</Description></Stakeholder></Organization><Vision><Description>A healthier information environment</Description><Identifier>_73eea766-39b3-11ec-933a-bca51983ea00</Identifier></Vision><Mission><Description>To limit the prevalence and impact of misinformation</Description><Identifier>_73eea86a-39b3-11ec-933a-bca51983ea00</Identifier></Mission><Value><Name/><Description/></Value><Goal><Name>Tools</Name><Description>Equip Americans with tools</Description><Identifier>_73eea9fa-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>Action 1</SequenceIndicator><Stakeholder StakeholderTypeType="Generic_Group"><Name>Americans</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Trusted Local Leaders</Name><Description/></Stakeholder><OtherInformation>to identify misinformation, make informed choices about what information they share, and address health misinformation in their communities, in partnership with trusted local leaders</OtherInformation><Objective><Name>Identification</Name><Description>Identify misinformation</Description><Identifier>_73eeaaf4-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>1.1</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Sharing</Name><Description>Make informed choices about what information they share</Description><Identifier>_2f2b9382-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>1.2</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Misinformation</Name><Description>Address health misinformation in their communities</Description><Identifier>_2f2b959e-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>1.3</SequenceIndicator><Stakeholder StakeholderTypeType="Generic_Group"><Name>Communities</Name><Description/></Stakeholder><OtherInformation/></Objective></Goal><Goal><Name>Research</Name><Description>Expand 
research that deepens our understanding of health misinformation</Description><Identifier>_73eeabda-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>Action 2</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation>including how it spreads and evolves; how and why it impacts people; who is most susceptible; and which strategies are most effective in addressing it</OtherInformation><Objective><Name>Spread &amp; Evolution</Name><Description>Deepen understanding of how health misinformation spreads and evolves</Description><Identifier>_73eeacca-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>2.1</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Impact</Name><Description>Deepen understanding of how and why health misinformation impacts people</Description><Identifier>_2f2b9710-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>2.2</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Susceptibility</Name><Description>Deepen understanding of who is most susceptible to health misinformation</Description><Identifier>_2f2b9904-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>2.3</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Strategies</Name><Description>Deepen understanding of which strategies are most effective in addressing health misinformation</Description><Identifier>_2f2b9a62-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>2.4</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective></Goal><Goal><Name>Technology Platforms</Name><Description>Implement product design and policy changes on technology platforms to slow
the spread of misinformation</Description><Identifier>_73eeada6-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>Action 3</SequenceIndicator><Stakeholder StakeholderTypeType="Generic_Group"><Name>Technology Platform Developers</Name><Description/></Stakeholder><OtherInformation/><Objective><Name>Products</Name><Description>Implement product design changes</Description><Identifier>_73eeae8c-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>3.1</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Policies</Name><Description>Implement policy changes</Description><Identifier>_2f2b9bca-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>3.2</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective></Goal><Goal><Name>Resilience</Name><Description>Invest in longer-term efforts to build resilience against health misinformation</Description><Identifier>_73eeb062-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>Action 4</SequenceIndicator><Stakeholder StakeholderTypeType="Generic_Group"><Name>Health Practitioners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Journalists</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Librarians</Name><Description/></Stakeholder><OtherInformation>such as media, science, digital, data, and health literacy programs and training for health practitioners, journalists, librarians, and others</OtherInformation><Objective><Name>Literacy Programs</Name><Description>Invest in media, science, digital, data, and health literacy programs</Description><Identifier>_73eeb13e-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>4.1</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Training</Name><Description>Invest in 
training</Description><Identifier>_2f2b9d46-403e-11ec-8b84-62371c83ea00</Identifier><SequenceIndicator>4.2</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective></Goal><Goal><Name>Convenings</Name><Description>Convene federal, state, local, territorial, tribal, private, nonprofit, and research
partners</Description><Identifier>_73eeb31e-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>Action 5</SequenceIndicator><Stakeholder StakeholderTypeType="Generic_Group"><Name>Federal Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>State Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Local Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Territorial Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Tribal Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Private Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Nonprofit Partners</Name><Description/></Stakeholder><Stakeholder StakeholderTypeType="Generic_Group"><Name>Research Partners</Name><Description/></Stakeholder><OtherInformation/><Objective><Name>Impact</Name><Description>Explore the impact of health misinformation</Description><Identifier>_73eeb418-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>5.1</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Best Practices</Name><Description>Identify best practices to prevent and address it</Description><Identifier>_73eeb4fe-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>5.2</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Recommendations</Name><Description>Issue recommendations</Description><Identifier>_73eeb5e4-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>5.3</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective><Objective><Name>Consensus</Name><Description>Find common ground on difficult questions, including appropriate legal and regulatory measures that 
address health misinformation while protecting user privacy and freedom of expression</Description><Identifier>_73eeb6ca-39b3-11ec-933a-bca51983ea00</Identifier><SequenceIndicator>5.4</SequenceIndicator><Stakeholder><Name/><Description/></Stakeholder><OtherInformation/></Objective></Goal></StrategicPlanCore><AdministrativeInformation><StartDate/><EndDate/><PublicationDate>2021-11-07</PublicationDate><Source>https://www.hhs.gov/sites/default/files/surgeon-general-misinformation-advisory.pdf</Source><Submitter><GivenName>Owen</GivenName><Surname>Ambur</Surname><PhoneNumber/><EmailAddress>Owen.Ambur@verizon.net</EmailAddress></Submitter></AdministrativeInformation></PerformancePlanOrReport>