May 3, 2024

Toward an AI-Enabled Communications Profession - with Anne Green

Like it or not, Generative AI is becoming part of the communications toolkit. But unlike any technology that has preceded it, AI is able to perform tasks and produce results that have always been the domain of PR professionals - especially at the entry-level. And that's just today; AI is only going to get better. How can communicators and marketers adapt, adopt, and advance in this new, AI-enabled reality?

In this episode of The Trending Communicator, host Dan Nestle dives deep into the challenges and opportunities AI presents for communications professionals with Anne Green, CEO of G&S Business Communications. Anne, an award-winning communicator, thought leader, and advocate for ethics within the PR profession, is leading the charge toward AI enablement within her agency and for her clients. But she’s keenly aware of the technology's black-box nature and, in the episode, outlines an approach that ensures ethical adoption and enablement.

Anne and Dan delve into these ethical considerations and much more as they examine the evolving role of AI in marketing and communications. They discuss the importance of approaching AI with a critical and ethical lens, considering its impact on human interactions and the industry's hiring practices. At the same time, they explore how AI can enhance strategic thinking and help communications professionals deliver greater impact to their businesses. Anne also shares her concerns about online divisiveness and disinformation, emphasizing resilience and big-picture focus. The episode concludes with a positive outlook on human creativity and the importance of staying engaged in societal discussions.

Listen in and hear about...

  • Ethical dilemmas for marketers in the tech and AI revolution.
  • The critical role of ethics in AI-driven marketing.
  • AI's growing business role and the imperative of strategic adoption.
  • The need to find a personal advantage in new tech adoption.
  • Adapting to AI's impact on communication and thought.
  • The importance of bridging AI and organizational goals.
  • The rise of online disinformation and its societal impact.
  • Optimism in the human capacity for innovation.

 

Notable Quotes

  • [05:50] "I want to see writing that's electric, that leaps off the page, even if it's a marketing pitch or something." - Anne Green
  • [20:13] "There's a lot of misperceptions about what this industry is about, but like anything in human life, there are the angels and the bad actors." - Anne Green
  • [29:01] "I want to invite people to think about big picture questions like what will it mean to be human? What does authorship mean? What does it mean to have co-intelligence or augmented intellect?" - Anne Green
  • [41:08] "In the case of generative AI, you need to approach it as if you are about to have an interaction." - Dan Nestle
  • [01:05:01] "I am kept up at night by the proliferation of divisiveness and fake stuff online, and very much engineered disinformation, which is upsetting to me." - Anne Green

 

Resources & Links

Dan Nestle

Anne Green

 

Timestamped summary for this episode (generated by ChatGPT)

[00:01:15] Anne Green's career path

Anne Green discusses her career path, from an internship to becoming CEO, highlighting the accidental nature of her journey.

[00:06:10] Reflecting on past experiences

Dan Nestle and Anne Green reminisce about their past experiences, specifically their connection through CooperKatz and the impact of their mentors.

[00:10:56] The enduring importance of writing

Dan Nestle emphasizes the timeless importance of writing and critical thinking in the PR profession, despite technological advancements.

[00:13:14] Adapting to changes in communications

Anne Green reflects on the changes in communication, from the advent of social media to the impact of technology on writing and critical thinking.

[00:16:14] The role of PR in upholding quality

Dan Nestle discusses the frustration of spin in PR and the need for the profession to uphold quality, truth, and critical thinking.

[00:19:22] Challenges in content quality

Dan Nestle and Anne Green acknowledge the prevalence of low-quality content and the need for PR to play a role in promoting good writing and ethical standards.

[00:19:31] Ethical Awareness in the Industry

Discussion about ethical awareness in the communications field and the responsibility to maintain integrity.

[00:21:15] Adoption of New Technology

Exploring the industry's capability to use new technology tools and the need to bring an ethical lens to their usage.

[00:24:46] Impact of AI on the Profession

Discussion on the changing landscape of the communications profession due to the rise of artificial intelligence and the need for professionals to adapt.

[00:35:17] The Personal Relationship with Technology

Exploring the personal impact of technology and the process of finding "killer apps" that create a personal connection with technology.

[00:38:23] Projection into the Future

Contemplation on the profound changes brought about by new technology and the challenges of projecting forward in a rapidly changing industry.

[00:39:16] The complexity of AI

Discussion about the evolving security measures and the need to adapt to changing technological threats.

[00:40:04] Understanding the "black box" of AI

Exploring the mystery and complexity of AI's decision-making processes and the work being done to uncover its inner workings.

[00:41:04] Learning a new way of speaking to AI

Adapting communication styles to interact effectively with AI, emphasizing the need for a different approach compared to traditional search engines.

[00:42:07] Refining questions for AI interaction

Emphasizing the need for iterative and refined questioning when engaging with AI to extract the desired information.

[00:44:14] Adapting to conversational AI

Drawing parallels between the evolution of search engines and the need to adapt to conversational AI, highlighting the shift in interaction styles.

[00:45:55] Challenges and opportunities with AI adoption

Discussing the learning curve, mindset shift, and the potential for leveraging AI to work smarter and add value.

[00:53:19] AI's impact on communication professionals

Exploring the unique skill set of communication professionals and their potential to excel in leveraging AI for business and client needs.

[00:55:42] Evolving roles and skill sets in the AI era

Emphasizing the elite nature of the communications profession and the need to recognize, practice, and adapt to the skill sets required in the AI era.

[00:59:02] The role of a communicator

Understanding the depth of knowledge needed about a business and its stakeholders.

[01:01:00] Enhancing communication through AI

Using AI to enhance communication by connecting dots and providing additional insights.

[01:05:01] Challenges and hopes for the future

Concerns about disinformation and divisiveness, along with hopes for a better future and the role of AI.

[01:06:55] Looking ahead with optimism

Expressing hope for positive changes through human creativity and invention, and the potential of AI.

 

Transcript
1
00:00:00,000 --> 00:00:04,200
Daniel Nestle: Welcome, or welcome back, to The Trending Communicator. I'm your host, Dan Nestle.

2
00:00:13,920 --> 00:00:28,020
I keep saying that the communications profession is dealing with unprecedented change. I mean, that's why I launched The Trending Communicator. If we can get a hold of the trends and the changes happening inside and outside of the marketing and communications professions, then we'll be

3
00:00:28,020 --> 00:00:46,080
better at our jobs and better advisors, better communicators. Today, let's take a broad look at some of what's happening in the world of integrated marketing communications with someone who, by any standard of measure, is a leader and contributor to the profession as a whole. Over three decades,

4
00:00:46,350 --> 00:01:02,340
she's blazed a career path that so many PR pros dream of, working her way up the agency ladder from her start as an associate at Burson-Marsteller, to building, leading, and growing an award-winning agency, to shepherding that agency through an acquisition and ultimately becoming a partner and CEO of

5
00:01:02,340 --> 00:01:20,670
the new combined agency. Along the way, she chalked up honor after honor, including the PRWeek 40 Under 40, multiple appearances in the PRWeek Global Power Book, PRNews' Top Women in PR; I can go on and on. She is a committed advocate for diversity, inclusion, and ethics in our profession, and so much more.

6
00:01:21,120 --> 00:01:28,350
Please welcome the CEO of G&S Business Communications and my friend, Anne Green. It is good to see you.

7
00:01:28,350 --> 00:01:36,270
Anne Green: Hey, Dan, so glad to be here. That's so kind. I'm laughing about how long ago the 40 Under 40 award was, but it's nice to hear about it sometimes.

8
00:01:36,480 --> 00:01:52,620
Daniel Nestle: Well, we don't have to specify dates on this show. I mean, you and I are of the same sort of generation, and, you know, that's sort of under the banner of no one really needs to know, but they'll figure it out. So forgive the references

9
00:01:52,620 --> 00:02:11,970
sometimes, as we jam here. You've had a great career, and I know that a lot of our listeners are in our world; they're in agencies, or they're on the client side. I think, you know, before we get into all the stuff, and by stuff, I mean the changes happening in the

10
00:02:11,970 --> 00:02:27,690
profession, and by changes happening in the profession, I kind of think we're gonna be talking a lot about AI, but there's a lot of other things too. But before we get into that, I'd love for our listeners to know a little more about you. If you can just give a little overview of, you know,

11
00:02:27,690 --> 00:02:43,740
how you got to where you are, because that career path that you've had is almost a textbook of what success in PR, especially on the agency side, looks like. So how did you do it? What were the highlights? Tell us.

12
00:02:43,800 --> 00:03:00,360
Anne Green: Yeah, it seems really rational and logical, and what's funny is it's really quite accidental. In some ways, my undergraduate work was highly liberal arts focused. I was lucky enough, and I consider it a privilege, to be able to do an internship; you know, that's not something I take for

13
00:03:00,360 --> 00:03:15,870
granted. It's something that I think all of us really advocate for, trying to give more support to those for whom that may be harder. But you know, I intended to be a literature professor. And in fact, during my time at Burson-Marsteller, right after it, and while I was helping to build CooperKatz, which is the

14
00:03:15,870 --> 00:03:35,160
agency that you referred to in the middle of the journey, before G&S, I actually went to NYU and did most of my PhD in American literature, in the highly relevant area of 19th-century American literature, with a focus on Edgar Allan Poe. So that's a whole lot of goodness. I loved teaching. But I wanted to

15
00:03:35,160 --> 00:03:48,510
not go straight to grad school, and so I was looking for internships. And I found the Harold Burson Summer Internship Program, which was Burson's flagship internship. It was competitive; it was 10 people every summer, in New York and some of the other offices. And I, you know, didn't really have

16
00:03:48,510 --> 00:04:06,000
access to the Internet back in those days, in the late '80s, early '90s. But I was able to intuit what PR was, make it through the writing phases and then the interviewing, and got that internship, and ended up going there after graduation from Vassar College. And I think what was great is I was like, let me

17
00:04:06,000 --> 00:04:22,050
take some time. Let me work. Let me just explore the world before I go to grad school. So I spent a few years at Burson, and I was lucky enough to be there at a time when there were some very senior people; Harold Burson was still alive. I used to ride the elevator with him, which was inspiring. Bill Marsteller had

18
00:04:22,050 --> 00:04:39,210
passed, but I was there with a lot of people that had helped build the firm from the '70s, and some of them were starting to move on to start their own agencies. But I got to start in B2B and consumer marketing, so I was working with world-class brands, both on B2B and B2C. It was a great

19
00:04:39,210 --> 00:04:53,940
training ground. I mean, I learned so much. I was in factories; I was just learning how things were made, and learning the craft. And Burson, of course, was the largest agency in the world at that time. So we had a newsroom; I could see the AP wire. And we had a creative team

20
00:04:53,940 --> 00:05:10,770
that would help us make the slides, you know, and we had early access to the web. And I met Ralph Katz and Andy Cooper, who were very senior, and they formed CooperKatz. And so even as I went off to grad school, I said to myself, well, I'll go help Andy and Ralph; I'll freelance

21
00:05:10,770 --> 00:05:28,170
for a while while I'm doing my coursework. And then, of course, that was '96, and 22 years later... I had left at the dissertation stage; I'd said goodbye to academia, which was fine. You know, I made that decision open-hearted. I loved it, but there were reasons to leave, because I loved our field, too. And

22
00:05:28,200 --> 00:05:44,910
I ended up CEO and a partner in the firm. And, you know, we were, you know, a $6 million agency by that time, and 330 people. So it is an interesting thing, because once I got into the career, it seems very linear, but I really am an accidental CEO, too. I don't have an MBA; I never intended this, but I

23
00:05:44,910 --> 00:06:02,010
love organizational leadership. So I think it's loving the subject matter, the curiosity about that; I mean, there are so many different subjects you learn in this field, and I know your career has been really diverse, Dan, right? And then the love of leading people and leading organizations. And

24
00:06:02,400 --> 00:06:10,860
it's not for the faint of heart, but it's so rewarding. So yeah, that's my arc, which is fun to reflect on at this point.

25
00:06:10,860 --> 00:06:31,740
Daniel Nestle: You make me want to go down memory lane, just a little bit there. Because CooperKatz, that's our connection originally. And, you know, I had the privilege and honor, I suppose, to meet Andy before he passed. And this was 15 years ago or so. I was working at the

26
00:06:31,800 --> 00:06:54,150
AICPA at the time, and we held the pitch and hired CooperKatz as a result of that, to help us with some of the things we were doing. Killer team, so much attention. But the main thing I remember from those days was how attentive and kind Andy

27
00:06:54,150 --> 00:06:56,310
Anne Green: and Ralph were.

28
00:06:57,450 --> 00:07:18,480
Daniel Nestle: And, you know, it's the kind of thing where you understand the value of that client relationship, or rather, I hate to even say "client," but the partnership that you have with your clients and with the agency. And, you know, it taught me a lot about what a good agency relationship

29
00:07:18,480 --> 00:07:36,750
should look like. And then, thankfully, as time went on, I stayed in touch with Ralph. And this is a reminder for me, I gotta reach out to Ralph again, because we used to catch up for lunch every couple of years. He's doing great. And believe me, we're gonna send

30
00:07:36,750 --> 00:08:02,250
him this show when it's done. But what a, you know, what a mensch, I would say. He is just such a genuine leader in the PR world and in client relationships, and to have learned from and been at his side, and Andy's side, too, I suppose, must have been just... I

31
00:08:02,250 --> 00:08:10,380
was listening to somebody calling it "the rub," right? Not like as in "that's the rub," but as in, it rubs off on you in some noticeable way.

32
00:08:10,650 --> 00:08:23,670
Anne Green: I'm reflecting as you talk, and it's such a great memory. I mean, this is why we're having a session coming up soon with our own team about networking, especially with our younger team members. And it kind of reminds you: this is about relationships over time, and just staying in touch with

33
00:08:23,670 --> 00:08:39,930
people that you enjoy, and you care about, and you want to. And as you get older and farther in your career, you can do more to be, you know, collaborative. But what you're making me think of is this industry can be very intense, right, especially on the agency side, but also the client side, the corporate

34
00:08:39,930 --> 00:08:57,090
side, right. It's high performance; it's very demanding. On the agency side, it's very competitive: it's competitive for talent, it's competitive for clients, right. However, I will say that when I think about how I came up at Burson, and especially with Andy Cooper and Ralph Katz, and we did lose Andy

35
00:08:57,090 --> 00:09:16,770
in 2013, to cancer; he was an amazing person. Ralph is doing great. But they were indicative of a lot of generosity of leadership and generosity of spirit that I encounter in this industry. And you and I both being part of Page, that community, and the PR Council, where I've been a board member, you

36
00:09:16,770 --> 00:09:35,550
know, here you're in the room with other agency CEOs of all sizes. And when we joined the PR Council in '98, CooperKatz was quite small. I'm now representing, you know, a much larger agency, and with Page as well. But I have to say, I continue to find that spirit of generosity everywhere, even

37
00:09:35,550 --> 00:09:52,200
amidst the competitiveness, and that shaped me. To your point, it rubbed off on me. I think with Andy and Ralph, specifically: ethics, really doing the right thing, having an ethical lens on how you view your work, and holding that line for yourselves and for your client, to protect both and to bring out the better

38
00:09:52,200 --> 00:10:11,760
angels on all sides. And then also kindness, and then also a really high bar for doing good work. You know, some of my mentors, like Andy, you could tell if he was like, this is mediocre and it's not acceptable. And I try to be kind about that, but I want to hold a high bar, too. And it's a

39
00:10:11,760 --> 00:10:25,800
very interesting time to do that, as we grapple with this sort of new technology, as you said. And I think we'll be arrested if we don't talk about AI; of course, they'll come in and shut the podcast down. But that's the kind of stuff you make me think of in terms of those human

40
00:10:25,800 --> 00:10:28,230
relationships, it's really quite
a generous industry.

41
00:10:28,740 --> 00:10:49,380
Daniel Nestle: Well, maybe, you know, that brings to mind probably a good place to start the discussion about change in the industry. Because the things that you're talking about, I think, are timeless and evergreen: that, you know, you have to have that ethical approach, kindness,

42
00:10:51,060 --> 00:11:09,270
an eye for quality, or understanding what quality really is, what's good work. And we always say, you know, don't let perfect be the enemy of good, et cetera. But that's a really hard thing for PR people to grapple with, I've found, especially when so many of us either were trained as

43
00:11:09,750 --> 00:11:29,340
writers, or just had that writing skill that got us through everything. And I never, you know, was trained in any particular way apart from academics, college, etc. But I always had a thing for writing, and I was always good at it, especially

44
00:11:29,340 --> 00:11:53,880
editing. And I have found that that's one of the unchangeable and unchanging aspects of our profession, no matter how much help we find in AI, in our little interns and assistants and partners, as I call AI; other people might call them other things. But no matter how good

45
00:11:54,300 --> 00:12:18,750
your large language models get at editing things, they're never going to get as good as the people who understand what people want to see and read, and who get those nuances of language that kind of defy usage and defy grammar and defy convention. So there's always a role for that, but it's

46
00:12:18,750 --> 00:12:19,650
critically important, I think,

47
00:12:19,650 --> 00:12:23,700
Anne Green: for us to continue to be great writers

48
00:12:23,700 --> 00:12:42,150
Daniel Nestle: and great editors. And I think you even alluded in some way to the idea that everybody needs to be a critical thinker, too. I mean, you have to really examine everything that you do, and never take yes for an answer, and never take no for an answer; just question and question and be

49
00:12:42,150 --> 00:12:58,530
curious. And those things are, I think, threads that are just going to continue to run through the profession, as everything else around it, I think, is wildly unhinged and fraying and pulling in different directions. But, you know,

50
00:12:59,400 --> 00:13:02,130
it is a wild profession, and challenging.

51
00:13:02,310 --> 00:13:19,410
Anne Green: You know, I'd love to reflect on some of what you're saying; you reminded me of a few things. I remember, you know, where we've been in our careers, and I'm a proud Gen Xer. (Me too.) But we got to see a lot of change happening in a compressed time. And of course, we got to see one

52
00:13:19,410 --> 00:13:35,910
of the biggest shifts, which is writing to the web, you know: TypePad, WordPress, the advent of blogging, the advent of social media, the advent of two-way communication, which really changed the nature of comms, in that command and control was kind of lost in many ways, which is a good thing.

53
00:13:36,960 --> 00:13:56,640
It's a wild world. But the thing about writing throughout that period, I remember hearing over and over again, there's always that fear. Like, say, when Twitter launched, or X, if you will, this feeling of, oh, it's 140 characters, my God, it's going to destroy writing; or that social media now is just

54
00:13:56,640 --> 00:14:12,780
bits and bytes, so there's no substantive writing anymore. There was also a big discussion; you might remember, I think it was, the Atlantic article titled "Is Google Making Us Stupid?" You know, because our brains' skill sets are different now than, say, I mentioned before, literature from the 19th

55
00:14:12,780 --> 00:14:31,560
century; someone like Edgar Allan Poe had an encyclopedic memory of what he read, spoke multiple languages, and would spend, without electric light, you know, nights with a candle writing just volumes and volumes and volumes, and had to store a lot of that. You know, our brains just

56
00:14:31,560 --> 00:14:48,660
work differently; they're very plastic. So I think I agree with the through-line point: no matter how often we've decried, oh, we're gonna lose this ability or that ability, or writing is not going to matter anymore, writing continues to matter. So what happened was content continued to be critical, and

57
00:14:48,690 --> 00:15:04,560
more modes, more channels, and more types of writing: short form, long form, feature, editorial, on and on and on, and then video. And I think, too, about, you know, the modes; our brains are plastic and will adapt to different modes and channels of information gathering. So maybe we don't

58
00:15:04,560 --> 00:15:20,820
need to retain as much, but we'll access it in different ways, I think. So the other through line is that, side by side with the need for writing and critical thinking, there's always been a worry that people could work harder at their writing. And I do think now, and we can talk about it

59
00:15:20,820 --> 00:15:38,310
now or put a pin in it for later, that I think we'd all agree the internet is flooded with a lot of garbage. I hope our industry tries to minimize our role in that, but I know that marketing comms plays a role in pumping stuff out, right, whether it's paid or whatever. But I do think, back to your

60
00:15:38,310 --> 00:15:55,680
point, that human intervention of writing that is precise, or that sparkles, or that has energy, or that's alive, or that leaps off the page... you know, I try to say to my peers, and especially younger ones, I want to see writing that's electric. I'd like to see it be alive, even if it's a marketing pitch

61
00:15:55,680 --> 00:16:13,110
or something, or a deck for new business, or whatever. And then the critical thinking to say, what is something that's really, actually interesting right now? What is thought leadership versus just a title that is not really thought leadership at all? So you're making me think about a lot of

62
00:16:13,110 --> 00:16:14,670
things at one time.

63
00:16:14,670 --> 00:16:28,830
Daniel Nestle: Well, that's good, because we have to think about a lot of things at one time. That's the nature of our profession, actually. And I think a lot of our listeners don't always understand that; a lot of people don't understand PR, apart from all those people that

64
00:16:28,830 --> 00:16:50,100
spin things, or those people that, you know, represent famous people as publicists; those are just tiny corners of what we do. And, you know, if I could step back a little bit, the idea of equating doing a good job with spin absolutely just infuriates

65
00:16:50,100 --> 00:17:10,380
me. Yeah. You know, there's a new AI tool out called Suno. I don't know if you've seen it yet, Suno AI. It's hysterical. I mean, it's in the gimmicky stage, I think, right now. But with a prompt, it makes a song. A simple prompt; it just creates a song, and the songs are

66
00:17:10,710 --> 00:17:34,440
ridiculously good. Like, you know, it takes just a few seconds, and you have, you know, in the free version, a minute and a half of a song in any style that you put in there, with lyrics and vocals and everything. So as a kind of, I don't know, either a joke or a moment of

67
00:17:34,710 --> 00:17:37,020
utter internal clarity. I,

68
00:17:38,580 --> 00:17:43,320
I was experimenting with it the
other day, and I prompted it
with

69
00:17:44,820 --> 00:18:08,550
you know: write a blues song about a PR and communications executive who is tired of waking up every day and having to spin the truth, and just wants to be a straight talker for a change. And it came up with this song called "Shadows of Deception." Oh my god. It's just

70
00:18:08,580 --> 00:18:28,320
it is great. I will share it with you later; remind me. But I was astounded, because that's the prompt that I gave it. And it came up with, you know, three verses of lyrics that, you know, on a bad day, that's my life sometimes. Like, I'm not saying that I have to spin for

71
00:18:28,320 --> 00:18:47,520
Lippincott at all, but, you know, we're all kind of pressured sometimes to just make things sound like they're not supposed to sound, or, you know, take the stink off the rose, or whatever metaphor you want to choose. And it's infuriating. So, you know, I

72
00:18:47,520 --> 00:18:48,960
think our profession and
certainly

73
00:18:49,950 --> 00:18:51,300
the kinds of things that we

74
00:18:52,590 --> 00:19:12,180
the kind of role that we have to play more and more is that of arbiter of what's good and what's not. And, you know, apply that critical thinking to say, look, let's ditch the spin, and let's, you know, apply the principles of quality and good writing and truth, I suppose, to what we're doing

75
00:19:12,180 --> 00:19:22,260
and, you know, I think where I was going with this was that technology and all the crap that's out there that you mentioned... you didn't say those words, but

76
00:19:22,260 --> 00:19:25,530
Anne Green: Oh, I might have. I think I didn't. It's, it is

77
00:19:25,560 --> 00:19:28,140
Daniel Nestle: It is what it is, right? It's garbage. It's crap.

78
00:19:28,230 --> 00:19:33,360
Anne Green: There's terrible content, loaded with a lot of content that just doesn't matter. It doesn't get better,

79
00:19:33,600 --> 00:19:34,590
Daniel Nestle: certainly doesn't
get better.

80
00:19:35,790 --> 00:19:44,160
You know, we are uniquely qualified to both be the guard against that,

81
00:19:44,760 --> 00:19:57,900
but also to be corrupted into making that kind of crap. You know what I mean? Like, there are people who will just take the cash and make that nonsense, and that's a problem for marketers and communicators out there who want to do good work.

82
00:19:57,900 --> 00:20:10,980
Anne Green: Yeah, it's interesting. One thing you remind me of: years ago, and I could never remember what the source was, but I remember seeing some research that showed that there was probably a higher degree of ethical awareness in this field than in many others, because

83
00:20:11,580 --> 00:20:25,170
we're on the front lines, especially on the agency side, but on the corporate side too, of representing things to other stakeholders. On the agency side, we have contracts that have, you know, dual indemnification, which is really legally important, because we're

84
00:20:25,170 --> 00:20:44,010
representing information on behalf of a company like yours, and they're giving us information, and we're both kind of liable for that. So I think that there's a lot of misperceptions about what this industry is about. But like anything in human life, there are two sides of it. You

85
00:20:44,010 --> 00:21:02,910
know, there are the angels and the bad actors. And I think that with the vast majority of our crew in this field, and I've met thousands of people now, as you have, just so many amazing people of all ages, there is really a high standard and a lot of deep thought about what is happening with the media. What is

86
00:21:02,910 --> 00:21:18,210
the role of the media in a democracy, in a big, messy media landscape? What is the role of paid channels versus owned channels? What kind of content are we putting out there? What is hitting our stakeholders? And I do very, very much agree with you, and you and I have talked about this offline before this

87
00:21:18,210 --> 00:21:36,390
podcast, that I don't know if our industry reflects enough on how well suited we are to grapple, very intentionally and optimistically, but also thoughtfully and with a lot of control, with these new technology tools, for sure. And part of that is really bringing an ethical lens to it,

88
00:21:36,390 --> 00:21:51,480
to ask ourselves, like, how are they being used, and to be super, super self-educating, individually and as an industry, on how they are changing, and what is happening, what's the structure behind it, and all that good stuff; and also drawing a line for ourselves and for others that

89
00:21:51,480 --> 00:21:53,790
we're working with. You know? Yeah.

90
00:21:55,110 --> 00:21:55,860
Daniel Nestle: And

91
00:21:57,060 --> 00:22:14,670
the way that we comport ourselves in the work that we produce is our way to show that we can do these things. Yes. And we can only hope that we have the persuasive capabilities, the influence capabilities, probably more influence than persuasion, I would think, but we have those

92
00:22:14,670 --> 00:22:32,460
capabilities, to build those relationships with our stakeholders, both internally and, if you're an agency, with your clients. You know, that would garner the trust that you need for them to start to respect your capability to contribute in that way, which is a very long slog,

93
00:22:32,490 --> 00:22:50,520
and in many cases, it's a long haul. And, you know, that's one of the things about our profession that I think other functions don't necessarily need to contend with to that degree. You know, if you're excellent at finance, you're excellent at finance; it's pretty

94
00:22:50,520 --> 00:23:09,120
clear. But if you're a CFO and you need to create insights for your business out of finance, that's a different story, right? Then it's not just about how good you are with numbers and tax code and all those other things; it becomes more about, wow, what interesting strategic directions

95
00:23:09,120 --> 00:23:27,900
have you suggested for the company. And I think that's the same thing with us, you know, in many ways. Anybody coming out of college can, you know, theoretically edit or write a document, or learn how to do a press release, the basics, right? Summarize the media, which is another thing we're

96
00:23:27,900 --> 00:23:51,660
going to need to talk about, because who knows if that's going to be necessary as much anymore. But, you know, to rise up as you have, through all the levels of client service and internal management structure, takes a much broader and, I

97
00:23:51,660 --> 00:24:15,570
think, a more widely applicable skill set in many ways. And that's part of the change, I think, facing our profession. We used to be, I think, that communications or PR and GR and AR, all these "R"s, were kind of pigeonholed into certain things that you do. Now, those guardrails, the silos, the

98
00:24:15,570 --> 00:24:17,460
cages, whatever you want to call
them are

99
00:24:18,330 --> 00:24:18,930
permeable

100
00:24:20,670 --> 00:24:26,820
or breaking down. And some
people are not taking advantage
of it. But I think a lot of us
really see

101
00:24:26,820 --> 00:24:29,730
that. And, you know,

102
00:24:30,330 --> 00:24:50,460
that is one of the things that's powering the change in the profession: that we, as professionals, want to do more to help our businesses, because we think we can; we want to help ourselves, our careers; we want to expand our capabilities; and we want to learn, because, you know, so many

103
00:24:50,460 --> 00:25:09,000
of us are just habitual learners and very, very curious people. Yes. So all those things together, you know, are powering the engine of change. And we as Gen Xers, you know, we've been through a couple of iterations already, and maybe that also uniquely qualifies us to deal with the

104
00:25:09,000 --> 00:25:12,120
next one or this current one,
which may be the biggest one
ever.

105
00:25:13,380 --> 00:25:17,220
And that, of course, is
artificial intelligence, Gen AI,

106
00:25:17,640 --> 00:25:41,130
especially with respect to, you know, comms and marketing. So, as we look at Gen AI and what it means, I mean, what a broad question. Yeah, you know, let's just start with some high-level thoughts there. I mean, you know, you talked about different modes and channels of accessing

107
00:25:41,130 --> 00:25:42,030
information.

108
00:25:43,020 --> 00:25:45,930
talked about, we both talked
about

109
00:25:46,560 --> 00:25:55,530
writing capabilities and critical thinking, you know. So from the get-go, Gen AI is changing the way we approach these things and do these things.

110
00:25:57,120 --> 00:25:58,680
What are you seeing, as,

111
00:25:59,070 --> 00:26:12,570
you know, from the agency side, and as somebody who has a broad view of so many businesses and types of businesses and clients and partners within the industry, you know, what are, like, the top things that you would kind of want to pick on?

112
00:26:12,900 --> 00:26:29,160
Anne Green: Yeah, there's so much there. It's so true. And it's an adoption curve, for sure, but it's a moving target. So we've been here before. It's also an arms race on the tech side, so many new platforms. I'm not even talking about, like, the big gorillas and, you know, which new iteration of

113
00:26:29,160 --> 00:26:46,710
Claude or GPT, ChatGPT, etc., and what the other big ones are, Gemini, you know, now, as it's evolved from the Google platform. I'm talking about, you know, what's being created at smaller levels, and also what's drafting off of those big guys. Like you said, there's like a million new things every day, right? So

114
00:26:47,400 --> 00:27:03,000
there's a whole lot of ways we're racing at one time. And this is true in corporations and on the agency side, and on the agency side it's particularly acute, because we feel that responsibility to stay ahead of the curve and to really be those counselors, right? And there's

115
00:27:03,000 --> 00:27:18,600
certain things where they say fake it until you make it; that's just not acceptable or real in this kind of situation. So I think that there's a lot of normal stuff we're sorting through, which we saw before in the social media era and the Web3 era, which is like, what

116
00:27:18,600 --> 00:27:33,990
tools and platforms do you need? What's their cost structure? It's changing every day; there's 7 million of them, and who's testing what, who's building what. But from my perspective, I'm really looking at it at three levels: organizationally, for us, and then, you know, for the partners we have, and really

117
00:27:33,990 --> 00:27:50,340
the whole industry, because I'm having these conversations all over the place. On the ground level, meaning: what are the tools? How is AI proliferating in capabilities and tools we're already using? Because the enterprise landscape of software and SaaS, that's where it is ground zero for a lot of us, on

118
00:27:50,340 --> 00:28:05,460
a business level, and also a personal level too: how is it being baked into our, you know, consumer tools? And then, what new platforms do we need to test? And then, what training do folks need? You know, and I like training all the way from, you know, obviously, prompt engineering, but also how is it

119
00:28:05,460 --> 00:28:22,170
evolving, and then what have the use cases been? And also, like, how do we think about the ethics side of it, and IP and bias, etc.? That's the ground level. The mid level, to me, is about intentional engagement and execution and pilots. What are the kinds of pilot programs we're running

120
00:28:22,170 --> 00:28:38,070
internally? What are the kinds of tests we're going to run very intentionally with clients to see where is the value? Meaning, maybe you want more efficiency; maybe you want to automate or eliminate rote tasks. You know, none of us are pasting clips into a book, as those of you in PR would know

121
00:28:38,070 --> 00:28:54,360
that, from years ago. We're not doing that stuff anymore; we don't need to do that stuff anymore. Or, where are there higher-value things around personas and content generation that isn't garbage, that's actually useful? And then the top level, for me, is where I really want to invite everybody;

122
00:28:54,360 --> 00:29:12,840
it doesn't matter whether it's an intern here, or my partners and ownership, or your CEO, or someone in school: I want to invite people to think about big-picture questions like, what will it mean to be human? What does authorship mean? What does it mean to have co-intelligence or augmented intellect? What does

123
00:29:12,840 --> 00:29:33,510
it mean to treat an AI like a companion in that way? I think AIs are gonna be companions in many ways. You know, this is a wild world, and it'll be very human and very alien at the same time. So I feel it's incumbent upon us all to think ground level, mid level, and high level, and that can also be conceived

124
00:29:33,510 --> 00:29:52,320
as immediate, near term, and far term, because none of us has a crystal ball. But one thing I will say is that I've been around for a lot of prognostication where it's like, agencies are dead, the press release is dead, this is dead, that is dead. And usually it's not true. It's also not true that

125
00:29:52,320 --> 00:30:02,760
nothing's going to change. And I think we've got to be in that moment now, to say, like, I'm going to resist the extremes but also be very open to some very radical change.

126
00:30:03,630 --> 00:30:14,130
Daniel Nestle: I think the tendency of humans is to just immediately assume it's a zero-sum game, that the things that are the way they are now are going to be the way they

127
00:30:14,130 --> 00:30:18,090
Anne Green: always are. I hate zero-sum thinking, and boy, do we do it all the time. We do it all the

128
00:30:18,090 --> 00:30:33,390
Daniel Nestle: time. And it's the source of so much strife. Yes. Because, you know, by that philosophy, or by that measure, if I have something, then you don't have it. And it's very dangerous, you know,

129
00:30:34,050 --> 00:30:46,980
Anne Green: and by the way, I just can't cosign on that enough. And it's dangerous everywhere: it's dangerous between two people, it's dangerous in organizations, it's dangerous in a society, 100 percent. And it's something we have to resist. Yeah.

130
00:30:47,280 --> 00:30:56,250
Daniel Nestle: And I go on sometimes about abundance thinking and all that kind of stuff. People talk about fixed mindset versus learning mindset. It's all one part of the same,

131
00:30:57,540 --> 00:31:00,780
you know, pie, I suppose. But

132
00:31:02,400 --> 00:31:08,400
to get out of our own way, and to stop navel-gazing, and sort of say, okay, yeah, we're going to change.

133
00:31:09,990 --> 00:31:11,820
But, you know,

134
00:31:12,270 --> 00:31:16,920
the profession as we know it is probably going to be a different profession several years from now, but it's still our profession.

135
00:31:17,970 --> 00:31:20,550
And yes, I,

136
00:31:21,090 --> 00:31:38,910
from everything I read, and all the wonderful people I speak to, and, you know, being immersed in AI especially as I am, there's no question changes could happen very, very fast. But I don't think it's going to happen so fast that everybody is suddenly going to be out on the

137
00:31:38,910 --> 00:31:58,260
street. I mean, that's not the way it is. You know, Gary Vee, who I, by the way, don't enjoy very much; occasionally he says things that are actually good or true. It's his personality; I'm not a follower. But, you know, he says that, you know, if you're not using AI at least 20

138
00:31:58,260 --> 00:32:13,620
or 30 minutes, or whatever, an hour a day, then you are going to be dead in the water. I wouldn't go that far. But I do think that in our profession, if you are not getting familiar with AI, with generative AI especially, if you're not experimenting and

139
00:32:14,730 --> 00:32:16,050
asking, What if

140
00:32:16,050 --> 00:32:41,280
I do this, what if, what if, all the time, then you will be left behind, or you'll be relegated to kind of just trying to figure out how to edit the next press release, or whatever it is, right? So there is going to be a difference between those who, I don't even wanna say embrace, but those who adopt and see the

141
00:32:41,280 --> 00:32:46,800
way to use AI for themselves and
for their organization, and so
on. And those who don't.

142
00:32:48,630 --> 00:32:51,330
But back to your three points,

143
00:32:51,750 --> 00:32:59,640
your three levels: your ground level, intentional engagement, and, I'll call it the philosophical level, those three things.

144
00:33:00,780 --> 00:33:03,750
It seems like, you know, it's
not just

145
00:33:03,780 --> 00:33:11,520
short term, mid term, long term, although it's helpful to think of it that way. But I think that people are in all three of these places at once, in many cases.

146
00:33:13,260 --> 00:33:24,270
And it really depends on, you know, what you're trying to accomplish with that particular AI or that task. In our case, or in

147
00:33:24,900 --> 00:33:40,770
my personal case, I'm probably all three. Like, you know, I have discussions all the time about this: what does it mean to be human? Oh, yes. And it's such a critical part of this, and we should get back to that shortly. But I spend most of my time in the ground level, right? Most of the time it's like,

148
00:33:41,250 --> 00:33:48,870
training people or understanding
the tools, you know, and just
understanding the enterprise
landscape and trying to within
my company anyway,

149
00:33:50,340 --> 00:33:53,970
spread the word, educate, advocate for

150
00:33:54,510 --> 00:34:13,080
the suite of tools that our IT team has created, which is really, I will say, such an advancement over so many other companies; some companies are doing nothing. And, you know, we have a killer team in Japan that's been doing whatever they can to provide

151
00:34:13,500 --> 00:34:17,550
access to, you know, to this
game changing

152
00:34:17,850 --> 00:34:34,500
AI. And I keep saying technology, but it's this game-changing entity, thing, whatever, of AI. Because I think they recognize that, you know, the AI-enabled business is the one that is going to win, or at a minimum, it's going to be table stakes for being able to

153
00:34:34,500 --> 00:34:41,370
compete, you know. So I find myself at that ground level a lot, just playing around.

154
00:34:42,570 --> 00:34:44,460
But it's hard to get, it's hard
to get the word out.

155
00:34:44,940 --> 00:35:00,900
Anne Green: Yeah. You know, there's a lot of things I've been thinking about as you're talking. I used to joke that no one had an existential crisis over a Microsoft Excel spreadsheet, because it's a business tool and it's real cut and dry. Social media really collapsed the private and public

156
00:35:00,960 --> 00:35:19,170
domain, the personal and business domain; it really was a collapse of that divide. And it tracked with a lot of other trends in society and technology and business and life that changed the nature of that intermingling; we use the word "hybrid" today in different ways. And I think

157
00:35:19,170 --> 00:35:35,670
that this tool, social media, was challenging in that way. It continues to be challenging generationally; people may be used to it, but they still have a lot of feelings about it. And it's complicated, especially as they've seen the impact of algorithms over time, the kind of unintended consequence that

158
00:35:35,670 --> 00:35:51,720
we sort of wandered into; Jaron Lanier and others were talking about it for a long time. But I think now this is even more complicated in terms of how people feel about it and relate to it; there's going to be a very diverse adoption curve. Like anything, people have to find their killer app. The first

159
00:35:51,720 --> 00:36:08,940
time I heard that phrase, which was more in the dot-com boom, it reminded me of the first time I saw spell check on the early personal computers in the '80s, and I realized for the first time why I might want to learn to use a computer. Before that, I was like, I don't know if I need this. Then I saw spell

160
00:36:08,940 --> 00:36:27,330
check, and I'm like, damn, I need this. And the idea of the killer app, finding those killer apps, those moments; that is your personal entree. So it's like, there's the business tool: okay, I can learn to use it and rationally understand it, and how might AI augment me and make things

161
00:36:27,330 --> 00:36:45,150
efficient. And this is how this works, and I'm playing with this. But where's that moment where you cross the Rubicon into a personal relationship, where it's like, this technology is truly useful to me, and wow, it's lighting me up, right? And so, you and I both listen to the Ezra Klein podcast; he's doing

162
00:36:45,150 --> 00:37:00,840
some incredible deep dives into this, and speaking with people like Ethan Mollick. And you know, Ethan was saying that you really need to spend 10 hours with it. And part of it is not just, like, treating it like, I put in a question, you give an answer, right? It's more that discursive process of building and learning,

163
00:37:00,840 --> 00:37:21,150
and also training these tools to manifest a personality or a set of parameters that you can relate to, or that relate to the business need or the personal need. This is a radically different shift. And I think for me, as I contemplate this, and, like you, try to get as deep as I can, and also try to

164
00:37:21,150 --> 00:37:37,380
push myself to be hands-on, where I am also finding barriers, to be like, how am I using this today? You know what I mean? But I think I'm trying to also, something you said earlier, project myself forward and say, what is this future state going to look like? I'm standing in this agency, in this physical space.

165
00:37:37,650 --> 00:37:59,160
What does it look like? What are we doing? What are we not doing? What was hard? What was easy? I think, project back to the adoption curve for Web 1.0, Web 2.0, and social: I can see what those shifts ended up being; I can see how hiring changed. We don't look that different, but we are profoundly different. So

166
00:37:59,670 --> 00:38:16,230
I am very interested in this question, and I want to keep interrogating it, because I'm not 100% sure. Like, I was just having a discussion with a few agency CEOs yesterday: will our hiring change? Will the entry level look different? These are big questions of how do we project forward, where

167
00:38:16,260 --> 00:38:23,700
the pace of change is so massive that it's hard, and we don't really have an analogue for it. If that makes sense?
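
What Anne describes here, training the tool to manifest a personality or a set of parameters, can be sketched in code. This is a minimal illustration only, assuming the Anthropic Python SDK for Claude (mentioned later in the conversation); the persona text and model name are illustrative assumptions, not anything specified in the episode.

import anthropic  # assumes: pip install anthropic, ANTHROPIC_API_KEY set in the environment

client = anthropic.Anthropic()

# The "personality or set of parameters" lives in a system prompt,
# so it shapes every exchange instead of being restated each time.
PERSONA = (
    "You are a senior PR counselor at a B2B communications agency. "
    "Be direct, flag ethical and IP risks, and avoid spin."
)

reply = client.messages.create(
    model="claude-3-5-sonnet-latest",  # illustrative model name, an assumption
    max_tokens=400,
    system=PERSONA,
    messages=[{"role": "user",
               "content": "How should we position a client's plant closure?"}],
)
print(reply.content[0].text)

The design point is that the persona is set once, up front, and every later question is answered through it, which is one concrete way to "relate to" a tool rather than treat it as a search box.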

168
00:38:23,700 --> 00:38:40,230
Daniel Nestle: Oh, it makes perfect sense. At least when it comes to things like Web 1.0 and Web 2.0 and social, you know, it's a flat screen; it's going from one-way communication to two-way communication: oh, well, people can comment now, and isn't this great, oh my God, isn't

169
00:38:40,230 --> 00:38:57,600
this terrible; like, we've been in it, we've been dealing with it. Most people are still in Web 2.0, just dealing with that, and we're seeing the consequences in so many ways. And, you know, it took on a life of its own and in many ways gave technology as

170
00:38:57,600 --> 00:39:16,980
a whole a very bad rap and a high level of suspicion, and rightly so; brands, people should be suspicious, and should approach with extreme caution anything new that they're doing. And by extreme caution, I mean, protect your personal information. I mean, you know,

171
00:39:16,980 --> 00:39:34,410
like, do the steps that you need to do now in this world of ours. You know, imagine that every time you log in to any tool, it's like going through airport security, right? You've got to take your shoes off, unless you have facial recognition, which is

172
00:39:34,410 --> 00:39:50,820
your TSA Pre, and you don't take your shoes off; but generally speaking, you've got to go through this process. And then you've got to change it up all the time, because, you know, bad people get smarter too. And that's just the way it is. Now add AI to this, and it's a whole new level of stuff. People

173
00:39:50,820 --> 00:40:06,540
aren't sure whether to treat it as a layer of technology that requires the same type of thinking, or something completely different. Of course, you and I both think it's both of those things. But it's also something very, very different. We also don't know; it's a black

174
00:40:06,540 --> 00:40:15,240
box, right? We keep hearing that: it's a black box, a black box. And, like you mentioned before, incredible work is being done to try to figure out what the heck is in that black box.

175
00:40:15,510 --> 00:40:15,990
Anne Green: I love it.

176
00:40:15,990 --> 00:40:26,820
Daniel Nestle: I know, it's so interesting, fascinating. And 10 hours is nothing on this; it just goes by. But getting back to something that you said, and, you know,

177
00:40:27,930 --> 00:40:35,760
how I think we need to approach it, even at the beginning: you said

178
00:40:35,760 --> 00:40:53,970
something, and I'm laughing because I include it when I talk about AI with people who don't understand AI, or when I train people in my organization. You know, one of the first things I say is you have to learn a new way of speaking that isn't a new way of speaking at all, right? So you

179
00:40:53,970 --> 00:41:05,730
have to learn, you have to get past this cognitive dissonance that everything is Google: you ask a question, you get an answer, oh, thank you, bye, done.

180
00:41:06,300 --> 00:41:10,350
Anne Green: Right, that transactional, neutral transaction, exactly.

181
00:41:10,950 --> 00:41:31,290
Daniel Nestle: In the case of generative AI, as you know, and many of the large language models that are out there, you need to approach it as if you are about to have an interaction; call it a conversation, call it an interrogation, whatever it is. You know that you will

182
00:41:31,290 --> 00:41:49,740
get more out of it when you're able to refine the things that you are saying, and your thinking, even along the way. Yes. You know, when you go to Google with a question, it's: you have the question, here's my answer. When you go to, you know, ChatGPT or Claude, and I love

183
00:41:49,740 --> 00:42:07,020
Claude these days, by the way, but when you go to ChatGPT or Claude, you know, you may have a question. But understand that you're essentially walking up to something that has access to all of the information that has ever been

184
00:42:07,830 --> 00:42:34,350
digitized, give or take. And if I was going to ask you, you know, tell me about an agency; now, you have your agency experience, you could talk about what agencies are from your experience at Burson and then at CooperKatz, and now at G&S

185
00:42:34,350 --> 00:42:39,900
Business Communications, right. But you'll tell me about an agency, and I'll say, okay, great.

186
00:42:41,220 --> 00:42:55,950
But really, what I really want to know is, actually, what about an advertising agency or a marketing agency? Or maybe what I really want to know is something completely different. So if it was Google, I would ask the question and it would come back. But with AI, I would approach it and then say, well, no, no, that's not what I

187
00:42:55,950 --> 00:43:15,360
mean; let me refine my question, let me talk to you a little bit more. And we'd have a conversation, so I kind of get to the bottom line of what your knowledge is and what kind of questions I should be asking you, right? With AI, you know, you can have a lot of

188
00:43:15,390 --> 00:43:35,550
thinking beforehand, and sort of, you know, understand your context, and instruct this thing that has billions and billions of bytes of data, whatever; I don't know what the amount is, terabytes, gigabytes, I don't know. But all of this data, and you have to realize that you're asking it a

189
00:43:35,550 --> 00:43:49,260
simple question, when it has all
the complexity in the world to
search through to get to that
simple question. So you need to
give it a little bit of a
guideline, right, so that you're
not wasting time and go back and
forth and iterate it reminds
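
What that back-and-forth looks like in practice is easy to sketch in code. The following is a minimal, illustrative example only: it assumes the OpenAI Python SDK, and the model name and prompts are placeholders, not anything specified in this conversation.

```python
# A sketch of treating the model as a conversation, not a search box.
# Assumes the OpenAI Python SDK (`pip install openai`) and an
# OPENAI_API_KEY in the environment; model and prompts are placeholders.
from openai import OpenAI

client = OpenAI()

# Give the model context and guidelines up front,
# instead of firing off a bare, Google-style query.
messages = [
    {"role": "system", "content": (
        "You are helping a communications professional. "
        "Ask a clarifying question before answering if the request is vague."
    )},
    {"role": "user", "content": (
        "I work in corporate comms at a manufacturer. Tell me about "
        "agencies -- specifically what a PR agency does for a company "
        "like mine."
    )},
]
reply = client.chat.completions.create(model="gpt-4o", messages=messages)
print(reply.choices[0].message.content)

# The refinement loop: keep the history and narrow the request --
# the "no, that's not what I mean" back-and-forth.
messages.append({"role": "assistant",
                 "content": reply.choices[0].message.content})
messages.append({"role": "user", "content": (
    "Closer, but I actually meant marketing agencies. "
    "Re-answer with that focus, in under 200 words."
)})
reply = client.chat.completions.create(model="gpt-4o", messages=messages)
print(reply.choices[0].message.content)
```

The detail that matters is the appended message history: each refinement carries the whole context forward, which a one-shot search query never does.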

190
00:43:49,260 --> 00:44:09,630
Anne Green: It reminds me... it is
a new old language, because it's
more human language. But that's
too simplistic and also
complicated at the same time. But
yeah, it reminds me of the early
days of search. We had to be
trained, and Google helped to
refine this training of how to do
search. You went from Ask Jeeves,

191
00:44:09,660 --> 00:44:27,990
for, you know, our older folks out
there who remember Ask Jeeves,
right? People would write very
long questions, because you came
to it like a human who hadn't been
trained in search yet. Then we got
trained to be really small and
precise, like just a phrase. And
we got really good

192
00:44:27,990 --> 00:44:45,240
at it. People are very good at
search today. Right? Very, very
good. And then we realized, oh,
Google's super smart, it's reading
for meaning (this was the early
days of AI), it's able to
understand. So our queries got
longer, and we all know the stats
that most searches are pretty
unique.

193
00:44:45,720 --> 00:45:03,240
They're quite distinct from each
other; I forget the exact stats.
But to go into it in a more
conversational mode, a real back
and forth, like you're doing an
interview or an interrogation, it
reminds me of media training. This
is not interrogation, but to go
through it with full, complete
sentences and also

194
00:45:03,240 --> 00:45:24,240
paragraphs, and to say, as you
said: Claude, that wasn't my
intention. I want you to rethink
this in a new context. What if I
was asking about this area of
inquiry? And also, don't forget, I
want you to add this layer to it,
in this context. We're not trained
to interact that way with
machines.

195
00:45:24,240 --> 00:45:41,370
We've been trained to interact
with humans that way. So, right,
to me it goes back to that
plasticity of the brain: how
quickly we adapt when we have to
confront change, and how resilient
and quick we are. When we find
those killer apps, especially,
suddenly everybody's behavior
changes. I mean, the

196
00:45:41,370 --> 00:45:55,320
second that Google Maps launched
with traffic, everyone was like,
now I need a cell phone, I need an
iPhone for this. But what you're
saying is very key to how we are
changing now and are going to have
to change, and how hard that
learning curve is.

197
00:45:55,680 --> 00:46:14,160
Daniel Nestle: Yeah, it's a
learning curve. It's a mindset.
And it's, like we said, again,
this kind of cognitive dissonance:
I'm looking at a computer, so I
have to behave a certain way.
Yeah. But you're telling me that
all I need to do is use my normal,
natural language, and

198
00:46:14,160 --> 00:46:28,950
something's going to happen?
What's that about? The people who
are getting the most out of gen
AI, especially, they're getting
the most out of it because they
think about all the different
things that they want to say, or
that they want, like all the

199
00:46:28,950 --> 00:46:43,770
different permutations or
directions that this could
possibly go in. No, I don't want
this to go in that direction; I
don't want to go in that one. I
have an idea, and I want to see if
it's going to work. I'm going to
give it all the information I
possibly can. And then I'm going
to

200
00:46:43,770 --> 00:46:45,840
remember: oh, yeah, I think

201
00:46:46,440 --> 00:46:52,230
it would be useful if, you know,
the AI had

202
00:46:52,770 --> 00:47:07,710
this other information to look at.
And oh, maybe it would be useful
if it had this data to look at.
You know, then you can start to
add more and more in. And people
have to understand this multimodal
thing, where you can use different
types of media, and you can talk
about different

203
00:47:07,710 --> 00:47:25,140
kinds of things. It's such a leap,
it's such a jump from where we
are. And sometimes it's easy to
overlook that, or easy to forget
that, for those of us who've been
in it deeply (because we love this
kind of stuff, or we know it's
necessary), that

204
00:47:25,140 --> 00:47:44,160
the vast majority of people are
still in that question, answer,
thank-you-bye mode. And this is, I
think, the very first thing that
really needs to happen in order
for folks to find that killer app.
You know, the other great thing
about AI (or dangerous thing,
depending on how you look at it)
is that

205
00:47:44,850 --> 00:48:05,190
the killer app is you and your
thinking. So at the most basic
level, you hear about people
creating prompts just on their
own: a great prompt that they then
cut and paste and put someplace
else and hold on to, because
they're going to use it again and
again and again, in order to, you
know,

206
00:48:05,220 --> 00:48:18,780
create something that they need to
create again and again and again.
And guess what you've just done:
you've just coded a program.
Congratulations! And you've been
able to do it with your natural
language. So it's making these
connections to what

207
00:48:18,780 --> 00:48:32,670
you're actually able to do. You've
created an app; every prompt you
make is an app, essentially, if
you can use it repeatedly. So the
way that your mind has to kind of
expand, I'd say, or

208
00:48:33,120 --> 00:48:34,050
just drop

209
00:48:35,130 --> 00:48:47,820
conventional approaches, or
reinterpret conventional wisdom;
it's more like that. Again, going
back to Carol Dweck and the
learning mindset: you have to be
flexible when you approach these
things, and adapt. Yeah, go ahead.
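
That "every prompt you make is an app" idea translates almost literally into code: a saved prompt with fill-in-the-blank slots is a small program you run again and again. Here is a hypothetical sketch; the template text, function name, and model are invented for illustration, and it again assumes the OpenAI Python SDK.

```python
# "You've just coded a program" -- a reusable prompt as a tiny app.
# Hypothetical example: the template and workflow are illustrative,
# not a real process described in this episode.
from openai import OpenAI

client = OpenAI()

# The saved prompt people cut, paste, and reuse is really a
# template with slots.
PRESS_BRIEF_PROMPT = """\
You are a senior PR strategist. Draft a one-paragraph briefing note.
Audience: {audience}
Announcement: {announcement}
Tone: plain, confident, no jargon.
"""

def press_brief(audience: str, announcement: str) -> str:
    """Run the 'app': fill the slots, call the model, return the draft."""
    prompt = PRESS_BRIEF_PROMPT.format(
        audience=audience, announcement=announcement
    )
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
    )
    return reply.choices[0].message.content

# Repeatability is what makes it an app: same program, new inputs.
print(press_brief("trade press", "our Q3 sustainability report"))
print(press_brief("employees", "a new hybrid-work policy"))
```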

210
00:48:48,180 --> 00:49:03,570
Anne Green: Oh, no, I think that
part of it brings me back to, you
know, being an organizational
leader, and a communicator who has
been working this long, and a
counselor, still actively
counseling clients and still
actively training. You know, I
love media

211
00:49:03,570 --> 00:49:24,000
training and presentation-skills
training, and I'm still actively
doing my craft as well as leading.
Part of it is inviting folks to
really think differently, so that
they can understand how to engage
in new ways, whoever I run into,
or you run into, and that includes
me, pushing myself

212
00:49:24,000 --> 00:49:38,250
to get uncomfortable. It's an
uncomfortable moment. This kind of
change is really uncomfortable,
especially when you're hearing the
prognosticators (and bless Gary V
for telling us we're all about to
be thrown right off the ship, if
I've got that right). We've heard
that kind of stuff

213
00:49:38,250 --> 00:49:54,510
for years. And that's the kind of
stuff that I don't have a lot of
patience for. But I get why we
need the kick in the butt, I get
it for sure. And I take it for
what it's worth; I use it for
fuel. But to me, that's also where
bringing it back down to that
ground level is what makes it
real,

214
00:49:54,840 --> 00:50:10,650
what makes it useful, what brings
us to value. Another one of the
things I've been seeing: leaders
working smarter, not harder. How
do we do that? And how do we make
sure, especially on the agency
side, where time is what we're
being paid for, that time is being
used in the most

215
00:50:10,650 --> 00:50:27,750
valuable way? One of my big
mantras to my team this year is,
well, first of all, how do we show
our clients we're an essential
partner every day? Because that's
not something you can take for
granted. But the bigger part of
that, the North Star, is: how do
we know what is valuable to that
organization

216
00:50:27,780 --> 00:50:44,610
or those individuals on a given
day? What is valuable (and I'm not
talking about value in terms of
money), like, what's actually
valuable to be doing and
executing? And this is making us
interrogate all of that. And I'm
seeing it at the ground level.
Like many folks, our creative team
is all over this, because there's

217
00:50:44,610 --> 00:51:01,980
been a proliferation of new
capabilities in some of the core
enterprise tools, like Adobe and
Getty, that are amazing. And it's
helping them do stuff that used to
be the equivalent of the PR person
pasting clips into a book and
measuring column inches: masking
things out and rebuilding
backgrounds in seconds. And

218
00:51:01,980 --> 00:51:16,590
then in our areas, which are
highly, highly specialized, things
like agriculture, they can have
their eyes on a field and know if
it's correct or not. And that's
really detailed knowledge. So
that's one. And then there's how
the analytics teams are using it.
So it's been really interesting to
me to

219
00:51:16,590 --> 00:51:32,280
watch: where have the obvious use
cases taken hold? Where is that
adoption? Where are people saying,
guys, look at this, look at this,
look at this, within the
guidelines and the guardrails
we've created? And then where are
people saying, wow, I feel like it
should be useful here, but I'm

220
00:51:32,280 --> 00:51:37,230
trying to fight for it? So that,
to me, is going to be something
I'm watching really carefully
going forward. Yeah.

221
00:51:37,290 --> 00:51:55,290
Daniel Nestle: Yeah, me too. And,
you know, you're making me think
of something that my mentor and
friend, and occasional
collaborator, Mark Schaefer has
been saying to me lately. We keep
talking about, okay, learning how
to talk to AI, learning how to
speak with AI, learning this

222
00:51:55,290 --> 00:52:10,260
kind of new language, and I think
that's a very, very important and
critical thing. But he just
mentioned that, you know, it's in
the Adobe suite now. And if you go
on LinkedIn (I'm not sure if it's
for all LinkedIn members, but you
can check), you can create
comments

223
00:52:10,260 --> 00:52:17,100
based on AI, with a click. So AI
will come to you; you don't have
to do anything, right?

224
00:52:17,130 --> 00:52:21,960
it's coming to you. So
understanding, like,

225
00:52:21,960 --> 00:52:37,980
the kind of complexities of
prompting may not be necessary for
everyone, I understand that. But
understanding the way it actually
works will help you understand how
to take the best advantage of
what's going on inside, you know,
the Adobe suite, or what's going
on inside

226
00:52:37,980 --> 00:53:00,750
Microsoft Office with Copilot.
These are all connected. The whole
thing kind of brings to mind,
really (and I think this is where
we're going to go in an upcoming
webinar that we're doing
together), but let's talk about it
here for a second.

227
00:53:01,620 --> 00:53:02,100
You know,

228
00:53:03,480 --> 00:53:05,370
talking with AI, and we talked
about

229
00:53:06,300 --> 00:53:25,320
having conversations and
interrogating. And I like to say
interrogate, because that
question-and-answer thing is just
such a killer interaction. But,
you know, communications people,
and, I guess, marketers, are
really well suited for this. Like,
we have a very specific

230
00:53:25,320 --> 00:53:44,730
skill set in understanding the way
humans relate to each other. That
gives us a leg up. But we're not
necessarily being given the
opportunity to show that we have a
leg up, or we're not taking
advantage of it as a profession,
in all the places that we can.
We're letting (and letting is

231
00:53:44,730 --> 00:54:01,260
the wrong word) organizations,
especially enterprises, start with
IT, because it's a tech project, a
tech initiative, or an
infrastructure initiative, the way
they start with digital because
it's on a computer, it's on the
web. But

232
00:54:01,650 --> 00:54:06,900
you know, I would argue that
it's the creative

233
00:54:06,900 --> 00:54:12,600
people, it's the writers and the
thinkers, the folks who have to
create narratives

234
00:54:13,020 --> 00:54:13,590
out of

235
00:54:14,970 --> 00:54:17,220
various kinds of disparate
information.

236
00:54:18,240 --> 00:54:24,870
the people who have to interpret
what others are really thinking,
and what you think they're
thinking, you know...

237
00:54:26,460 --> 00:54:48,690
these skills (creativity,
curiosity, interpretation,
interrogation, interviewing), all
these things are the core skill
set of the future for AI. And, you
know, comms is, I think, uniquely
qualified to do this. So what does
that mean for the people we're
bringing into the profession? What
does that

238
00:54:48,690 --> 00:55:05,490
mean for the way that teams are
going to evolve, if indeed (and I
believe we will) we get the chance
to show our chops here, and we
really start to leverage AI to
contribute to our businesses and
to our clients?

239
00:55:05,490 --> 00:55:21,660
Anne Green: You make me think of
something I've said to my teams a
number of times over the years.
I'll talk about it in the agency
context, but it's absolutely true
on the corporate side and in
integrated marketing
communications, and I use that
phrasing because I think it's
broad. There's so much, as

240
00:55:21,660 --> 00:55:37,380
you said, permeability between
those, although in some
organizations, obviously, comms
and marketing are separate; I
totally get that. But there's a
real symbiosis. One thing I've
said to folks, especially as
they're growing in their career,
is to remind them, especially
those on the

241
00:55:37,380 --> 00:55:56,160
comms side (because we have many
different professionals here:
digital, creative, paid media,
project management), that overall,
this industry is a really elite
profession in terms of the skill
sets required, if you think about
all the things that we have to do
and try to do well. And we're very
good at

242
00:55:56,160 --> 00:56:11,610
beating ourselves up, too,
especially on the client-service
side; we're like, oh my God, I
need to be better at this. But
it's writing, communicating in all
types and forms of writing, speed,
project management, strategy:
elite communication, because most
stuff goes off the rails because

243
00:56:11,610 --> 00:56:33,180
humans aren't communicating
properly. You know: flexibility,
resilience, learning, learning,
learning; researching, sourcing,
and analyzing, and spitting it
back out as something that is more
valuable than just the component
parts. So that's the same speech I
give to people, which,

244
00:56:33,180 --> 00:56:48,300
by the way, sometimes people have
stood back and said, my God,
you're right; I never conceived of
myself this way. And I'm like,
look, we can all grow; we all have
some issues. But we need to start
by understanding how elite this
profession is and how much it
demands. So then, I very much

245
00:56:48,300 --> 00:57:07,890
cosign everything you said about
these skill sets. I would add
research and sourcing to that, and
also the ethical lens, and also
the questioning we do: How does
that work? Why does that work?
Where's that coming from? When
we're in a corporation or at an
agency, we spend all our time
interrogating

246
00:57:07,920 --> 00:57:26,010
our peers or our clients: Why is
that? Where's the proof? What's
the backup? How does that work?
Can I go to the factory? Can I see
it? Can I see who's the coder? So
one thing that I think, first and
foremost, our industry needs to do
is start to recognize, and be
mindful of, and engaged with, why
it is that

247
00:57:26,010 --> 00:57:43,620
these skill sets are well suited
for today. But that requires
mindfulness and education
regarding all the things we've
been talking about, what AI is and
is not, and where it's going. I
mean, we don't know where it's
going eventually, but it is a more
discursive, exploratory,

248
00:57:43,620 --> 00:58:01,650
interrogative, iterative building
process, right? So that's the
first thing: we need to take more
ownership of our characteristics,
and really be mindful and say:
guys, this. And then we need to
start to practice it. And we need
to be thinking very strongly and
clearly and mindfully and

249
00:58:01,650 --> 00:58:22,170
intentionally about what, say, an
entry-level role looks like, and
what kinds of things can help
folks work smarter, not harder.
Because the curve I want every
person here to come up, from
intern on, is the counselor's
curve. A counselor has to think
really deeply about

250
00:58:22,170 --> 00:58:38,040
something; they have to
self-educate; they have to learn
their industry; and they have to
have their eyes in the work and
then eyes up, to understand what's
happening. So that's the dialogue
I'm trying to have. It's like: how
do we work smarter? So, you know,
own our characteristics, own

251
00:58:38,040 --> 00:58:48,300
why that is, and that'll help you
actually learn how to use AI. And
then, figuring it out: let's
actually learn how to be smarter,
so that we can offer more value to
ourselves and others, faster.

252
00:58:48,750 --> 00:58:50,160
Daniel Nestle: Oh, you know, all
I keep thinking

253
00:58:50,430 --> 00:58:51,390
of is

254
00:58:52,710 --> 00:58:59,490
moving up the value chain, you
know; this whole idea of being the
counselor, the trusted advisor.

255
00:59:00,240 --> 00:59:04,560
just the level of

256
00:59:04,560 --> 00:59:09,540
understanding and knowledge that
you need to have about a
business, or about a client is
already

257
00:59:10,230 --> 00:59:14,070
deeper than anybody else's in the
business, for the most part.

258
00:59:15,570 --> 00:59:27,960
Anne Green: Because you see across
more of it, too. That's one thing
about someone in your position:
you have to partner. It's not just
multistakeholder outside of your
organization; it's
multistakeholder inside. And it's
incumbent upon you to understand:
I need to really

259
00:59:27,960 --> 00:59:36,600
understand the life of the senior
counsel; I need to understand the
life of our finance team; I need
to understand the needs of our
research and development. Like,
that's baked into it, I think.

260
00:59:37,080 --> 00:59:40,050
Daniel Nestle: You're totally
right. And it's also...

261
00:59:40,920 --> 00:59:55,050
Anne Green: I think it's the only
role, or the only function, you
know, that just has to know
everything, like, across

262
00:59:55,050 --> 01:00:06,510
Daniel Nestle: the entire
organization, but for the purpose
of connecting dots. So it's one
thing to say, yeah, I know what's
going on in that factory, or I
know what's going on over there.
Just knowing it doesn't do anyone
any good

263
01:00:07,170 --> 01:00:10,650
If you can't add to that, in
some

264
01:00:10,650 --> 01:00:27,060
way, in an additive way that
nobody else is thinking of, or
that people are not necessarily in
the mindset to come up with. What
are the dots that need to be
connected on this particular
issue? You're in a conversation,
you're at a meeting. If you're an
advisor

265
01:00:27,060 --> 01:00:44,550
and a counselor on the
communications side of things,
you're in a leadership meeting,
and people are talking about A,
people are talking about B. You
understand A, B, C, D, E, F, G
through Y and Z, maybe at a
surface level, but you kind of get
it, and you're like, wait, you
skipped...

266
01:00:44,700 --> 01:00:45,540
you skipped E,

267
01:00:45,810 --> 01:00:54,750
Or: you know what, I heard
something about this over there
that you're not taking into
account. Or, what

268
01:00:54,750 --> 01:01:13,320
happens a lot more often in
corporate comms is: everything
that you're talking about with
brand and with product is
terrific; here are our corporate
mission and corporate
sustainability, and there were
announcements by the CEO last
week; we can plug this in here,
because it not

269
01:01:13,320 --> 01:01:29,760
only supports what you're doing,
it enhances it, and maybe gives
you some new insights and
perspectives about how you might
want to better market this
product, or what your narrative
should be. And it's that
additional information, the
additional, I guess,

270
01:01:31,620 --> 01:01:33,330
packaging of different

271
01:01:33,330 --> 01:01:44,460
pieces of information that you
have. You know, it's really
important for the comms person to
be the nerve center of an
organization, for the comms team
to be the nerve center. And, you
know, back to AI:

272
01:01:46,140 --> 01:01:47,250
How can we,

273
01:01:48,270 --> 01:02:08,940
work with AI to enhance that role?
And part of it is, well, if we're
at the ground level, part of it is
that AI can now do a lot of the
grunt work, right, so that we can
focus our minds higher. Which is
really, really important, because
time is your

274
01:02:08,940 --> 01:02:19,830
friend here. But there are also
ways that we haven't discovered
yet, or that we're in the process
of discovering, that will enhance

275
01:02:20,280 --> 01:02:21,270
those connections

276
01:02:22,500 --> 01:02:38,430
that will enhance what you're
doing. I'm at the point now where,
if I have ideas about something, I
may not blurt them out in a
meeting, but I'll take them away
and create a mind map or
something. And, by the way, when
you talk about AI coming to you: I
use a tool

277
01:02:38,430 --> 01:02:59,940
called Xmind, and Xmind AI now has
this capability: as you're
creating a mind map, you click on
one of the branches and hit the
little AI button, and it's going
to come up with seven, eight, ten
potential branches off that
branch. So it's a killer
brainstorming app.
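
Xmind AI's button is a product feature, but the underlying move (hand a model one node and ask for candidate sub-branches) is simple to sketch. Here is a hypothetical Python version, again assuming the OpenAI SDK; the prompt wording and node text are invented, and this is not Xmind's actual implementation.

```python
# Hypothetical sketch of the mind-map trick: given one branch,
# ask a model for candidate sub-branches. Not Xmind's API --
# just the underlying idea.
from openai import OpenAI

client = OpenAI()

def suggest_branches(node: str, n: int = 8) -> list[str]:
    """Return up to n short sub-branch ideas for a mind-map node."""
    reply = client.chat.completions.create(
        model="gpt-4o",
        messages=[{
            "role": "user",
            "content": (
                f"I'm building a mind map. Suggest {n} sub-branches "
                f"for the node '{node}'. One idea per line, a few "
                "words each, no numbering."
            ),
        }],
    )
    text = reply.choices[0].message.content
    return [line.strip() for line in text.splitlines() if line.strip()]

for idea in suggest_branches("AI training for comms teams"):
    print("-", idea)
```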

278
01:03:00,930 --> 01:03:18,870
That said, I'll take my idea away,
go back, and think it through. But
now I have a friend, a little
smart friend. A colleague of mine
calls it your smart friend; I call
it my intern; our CEO calls it
your power suit. However you want
to describe it, now you have this
capability to say: you

279
01:03:18,870 --> 01:03:33,300
know what, I've been thinking
about these things. I need you to
act like a strategic adviser, and
I need you to walk me through a
few things. This is what I heard;
this is what I'm thinking. How can
you help me connect these things
together?

280
01:03:34,050 --> 01:03:35,610
Can you validate what I'm
thinking?

281
01:03:36,690 --> 01:03:53,700
Can you come up with some other
ideas? You know, it's like having
this new employee who's super
smart but quite dumb, because
they're not directed. So you have
to direct it properly, but then
you're going to get so much out of
it. And they lie. So it's a

282
01:03:53,700 --> 01:03:57,120
little intern that lies sometimes
and doesn't know it's lying. But
I...
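
That "smart intern you have to direct" pattern boils down to a role-setting system prompt plus your own notes; and because the intern sometimes lies, an explicit instruction to flag uncertainty. A hedged, illustrative sketch: the role text, notes, and model name are all assumptions, not anything prescribed here.

```python
# Sketch of the "act like a strategic adviser" pattern: set a role,
# hand over your own thinking, ask it to connect the dots -- and,
# since the intern sometimes lies, ask it to flag what it's unsure of.
from openai import OpenAI

client = OpenAI()

my_notes = """\
Heard in today's leadership meeting:
- Product wants a Q4 launch push.
- The CEO announced a new sustainability commitment last week.
- Sales says mid-market buyers keep asking about supply chains.
"""

reply = client.chat.completions.create(
    model="gpt-4o",
    messages=[
        {"role": "system", "content": (
            "Act as a strategic communications adviser. Connect the "
            "dots across the notes you are given, validate or challenge "
            "the user's thinking, and clearly flag anything you are "
            "unsure of instead of guessing."
        )},
        {"role": "user", "content": (
            my_notes
            + "\nMy thinking: the launch story should lead with the "
              "sustainability commitment. Validate that, and suggest "
              "two angles I haven't considered."
        )},
    ],
)
print(reply.choices[0].message.content)
```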

283
01:03:57,120 --> 01:04:04,680
Anne Green: I know, we haven't
talked about the beautiful
euphemism of hallucinations. But
that's just, you know, part of the
wild world that we live in right
now.

284
01:04:04,680 --> 01:04:20,730
Daniel Nestle: Well, sometimes
they're straight-up lies, and
sometimes they are hallucinations.
You know, you mentioned Gemini
before, and I've had my issues
there, verified. We'll have to
save that for another time. I
think we have so much here that
we've spoken about. And I really
appreciate the

285
01:04:20,730 --> 01:04:36,780
framework that you offered about
how we're approaching AI at these
three levels: the ground level,
the intentional engagement part of
the process, and then this
philosophical question. And I'm
going to leave it with that
philosophical

286
01:04:36,780 --> 01:04:54,870
question, almost. The last thing I
want to ask you before we wrap up
(maybe not related to AI, but
maybe it could be) is just the
question I ask a lot of my guests:
what's keeping you up at night
these days? And, you know, if it's
related to everything we've

287
01:04:54,870 --> 01:05:01,290
talked about, that's cool, too.
But what's keeping you up, and
what are the last words you would
say to our listeners?

288
01:05:01,950 --> 01:05:21,240
Anne Green: Oh my gosh, I don't
want to become very dystopic at
the end of our wonderful
conversation. I am kept up, in
connection with what we've been
talking about, by the
proliferation of divisiveness and
fake stuff online, very much
engineered. You know, I really put
it in the

289
01:05:21,240 --> 01:05:42,210
category of disinformation, or the
convenience, to many actors around
the world, whether for money or
for nation-state meddling, of
riling people up and creating
rancor and anger. And that is
upsetting to me. I'm thinking a
lot about the fact that there are
many issues that could become
politicized

290
01:05:42,210 --> 01:05:57,810
that I think are actually really
just important for us as a human
society, like inclusion, and
understanding that our differences
are strengths. And that's true of
the States and the political
landscape, with the election year.
So that stuff keeps me up. But I
think also just the state of the
world. I was a

291
01:05:57,810 --> 01:06:15,240
Star Trek fan for years. In Star
Trek, there was sort of an
underlying feeling of utopianism:
that as much as humans would screw
things up, we, the plucky humans,
would pull it out at the end. That
was really Captain Kirk's whole
personality. While Spock was like,
I wouldn't advise that, and

292
01:06:15,240 --> 01:06:35,100
Captain Kirk was like, let's go
for it, we're the plucky humans.
So I'm really hopeful, while I'm
also worried about where we're at
right now, because we're at a lot
of crossroads, as usual. Running a
business, I love to see
resilience; I love to see the
economy doing well and dealing
with a lot of weird

293
01:06:35,130 --> 01:06:49,890
curveballs. And I'm really
grateful for the resilience of our
people. So I have a lot of
confidence, but I'm definitely
watching the big-picture
landscape, and AI is a big part of
that. Because when you're asked to
shepherd an organization, whether
you're internal or running it, as
I am,

294
01:06:50,400 --> 01:06:55,290
you know, that's a lot of
responsibility. And I feel the
need to keep my eye on the big
picture here.

295
01:06:55,620 --> 01:07:15,750
Daniel Nestle: Yeah. And thank
goodness you are. I mean, we need
more people like you who are
looking at the big picture. I
would like to end a little hopeful
with my own thinking about AI. My
hope is that humans, by nature,
are inventive and creative. And,
you know, again, back to

296
01:07:15,750 --> 01:07:30,870
where we started: this is not a
zero-sum game. Things are going to
change and continue to change, and
they have the potential to really
change in a terrific way. On that
note: I think our listeners, if
they want to reach out to you, can
just go to gscommunications.com.
They can

297
01:07:30,870 --> 01:07:39,570
look up Anne Green on LinkedIn
(that's Anne with an E, and Green
with no E; I often get them
confused), but Anne Green. Any
place else that people should be
looking for you?

298
01:07:39,570 --> 01:07:54,180
Anne Green: Really, LinkedIn.
Yeah, Anne Green, and
gscommunications.com, that's our
website. And you know, I'm always
happy to connect. These are
big-picture issues. I love being
in touch with folks like you, Dan,
and others, talking about them. So
I do welcome people to reach out
and connect.

299
01:07:54,360 --> 01:08:01,410
Daniel Nestle: Thanks so much. And
I know we're going to be in touch
a lot more in the near future. So
again, thanks, thanks for coming
on The Trending Communicator. I
appreciate it. Thank you.

300
01:08:07,920 --> 01:08:24,990
Thanks for taking the time to
listen in on today's conversation.
If you enjoyed it, please be sure
to subscribe through the podcast
player of your choice, share it
with your friends and colleagues,
and leave me a review. Five stars
would be preferred, but it's up to
you. If you have ideas for future
guests,

301
01:08:24,990 --> 01:08:32,310
or you want to be on the show, let
me know at
dan@trendingcommunicator.com.
Thanks again for listening to The
Trending Communicator.