What happens when you no longer control your own likeness? Is there an ethical line to be crossed with posthumous product spokesmanship? We skirt the line in this episode and get topical - talking about the subject of ethics and artificial intelligence and how online communities have banned "Deep Fakes" - pornographic simulations produced by artificial intelligence.
Deep Fakes: weaponizing AI
We're seeing a really gross intersection of what we talked about on our prediction show around digital personal identity rights: body image and data technology, and how it advances with consumer products.
The Verge and an AP article both discuss the emergence of "deep fakes": applying people's likenesses, using AI, in a pornographic way. These communities take high-res videos and still frames of notable actresses as training data and apply their likenesses to nude photos.
There's no real legal consensus on deep fakes and their consequences, so a lot of these online sites have come together and banned them and their communities.
This hits right on that scary, Black Mirror-esque world we now live in, where your face can be applied without your consent to literally any context, in some of the seediest and darkest ways, with no way for you to manage it.
Legal Ramifications of Deep Fakes
The legal ramifications are unclear because we've never had this sophisticated level of technology.
This is something that will come up in law, and we'll probably start to see entire bills at the federal level.
There have been no federal regulations yet that address how to handle your body data.
The Historical Blind Eye to Invasive Technology
It's troubling that a lot of communities turned a blind eye to faked still images for years.
There was this incredible story in Wired about 10-12 years ago about how Gillian Anderson at one point was the most photoshopped face on the internet, and many of the photos were suggestive.
They were suggesting it was because of facial symmetry.
Those types of images have been around for decades with no one doing anything about it.
We can all agree that it's harmful to somebody in some way when they're applying your likeness in that way.
The advent of AI assisted fakery is taking it to the next level and blurring the line of realism.
Incredibly complex technology in the hands of the people:
We were talking to Greg Steinberg at Something Digital and he asked hypothetically, what if we applied this to products? You could change any scene, from commercials to videos, to represent your product with AI.
You could apply a Coke filter to any image, and anything anyone is holding becomes a can of Coke.
Amazing movie technology is now available in the palm of your hand.
It's now available for consumers and businesses to take advantage of in a pretty easy way.
In 2016, at the Adobe MAX creativity conference, Adobe announced a Creative Cloud tool: with about 20 minutes of spoken-word training data, you could train an AI or ML algorithm to parrot back typed phrases in another person's voice.
A year and a half ago, a tech demo showcased the Face2Face algorithm applied to fake CNN broadcasts: a source actor's facial movements were overlaid onto political figures, showing George W. Bush and Vladimir Putin saying things they didn't actually say.
Is there even one way that this is a positive contribution to society?
Manipulative technology for sales
There are ways it could be leveraged to sell things, and businesses can use this type of tech as a tool.
We touched on this in episode 8.
People will have control of their body data and can leverage it for various reasons: they can sell it, it will outlast them, and someone will need to monitor and manage it after they die.
For models (and maybe everyone's a model now) they'll be able to give companies access to their body data for specific reasons.
The question is: how are we going to enforce this?
It's such powerful tech that if we don't have good governance, then it's going to get out of control real quick.
But legislating this new reality doesn't mean it will control people's behavior. Making it illegal won't make it cease to exist.
The difficult and expensive road to wiping your image from the internet
Wiping your image off the internet, especially someone's likeness or personal image, takes a lot of work: people have to trademark their face or send DMCA takedown notices to sites like Reddit to actually enforce their copyright.
At some point, this is what's coming in weaponized tech and disinformation campaigns, and in no way is that helpful for humankind.
Policy Update with Danny Sepulveda!
The political traction net neutrality has is fascinating.
What happened immediately, and even before Ajit Pai repealed it, was a fairly widespread uprising of folks supporting net neutrality.
Nonetheless, the FCC went forward with the repeal.
A number of states have gone forward with their own net neutrality rules.
The original net neutrality rules were over 400 pages long and covered fairly complex issues.
"I've been working in this for over 20 years, and I've never seen an issue that's gotten so much traffic."
Reasons for the traction:
People love the idea of the internet as a public space, open and accessible to everyone on a relatively egalitarian basis.
People don't appreciate a regulator behaving in the best interest of the regulated as opposed to the interests of the public.
Republicans believe that if you own pipes going into somebody's house, you should have the freedom to contract with the providers for different treatment for better ROI, and consequently, this would encourage additional investment in infrastructure around the country.
But there's a tremendous amount of incentive to manipulate that gatekeeper function for non-productive ends to extract tolls and rents.
Most Democrats believe net neutrality should be upheld because it works.
The way people access the internet now, without intervention from internet service providers, has worked really well for innovation.
Congress is considering repealing the FCC's decision.
It's highly unlikely to work, given Republican control of both chambers of Congress.
Right now, 49 Senators wish to repeal the rule; we only need one or two more to agree, but it's highly unlikely the House would follow.
Even if the House voted to go back to net neutrality, the President could still veto the effort.
We are unlikely to see a restoration of net neutrality during this administration.
There are also lawsuits against it right now.
The courts could throw it out, which would return us to Obama-era rules.
Once the courts decide, either way, it will create a political dynamic in which members of Congress will have to come to some decision about whether they wish to write into law some kind of compromise.
In all likelihood, Net neutrality won't be restored.
But there's been a lot of activism, and we'll see what it means politically in the midterms going forward.
BACK TO THE DUDES
In a commerce context, body data is really useful. But using it to accomplish things with people's images is just dangerous.
We don't see it going unused.
It's tech that exists now; businesses will use it and find use cases for it. Now that it exists, we can't go back.
Aside from spokesperson and generational licensing groups like the Elvis and Marilyn Monroe estates, all we see this being novel for in a commerce context is more and more Reba McEntire and KFC Colonel Sanders mash-ups.
We don't need more Jim Gaffigan Colonel Sanders to make us buy fried chicken, but that's where we're heading.
Consider though, the Micro-Spokesperson: using AI to determine the best person to influence another set of customers.
That influencer will sell their digital body rights to influence a certain set of people based on specific sets of DNA factors.
DNA TESTING, FOLKS!
23andMe was spamming the heck out of us during the Winter Olympics.
If you watched the Olympics, you probably saw their ads at least 50 times.
It's just one example of new DNA testing groups.
There's a ton of other really specific stuff going on with DNA testing.
It's getting better and better, and you're able to determine more stuff with it.
A company is matching DNA to medications: you get your DNA scanned and then get a better understanding of which medications will work best for you based on your results.
It's personalized medication for you.
BACK TO INFLUENCERS! (Honey I shrunk the influencers.)
VentureBeat talks about Influential, a company that just launched a social intelligence platform. They find influencers for brands with the help of IBM Watson.
They can find people based on micro-segment affinities to predict whether or not they would be influential for a brand, for micro-influencer engagement.
Imagine if they took training data from dating apps and used it to help create influencers based on attraction factors that would make people trust somebody more, or like them more, because they look a specific way or have a certain personality. (We so need GDPR in the US.)
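As a toy sketch of what this kind of segment-affinity matching could look like in principle (the segments, scores, and cosine-similarity scoring here are all invented for illustration, not Influential's or Watson's actual method):

```python
import math

# Hypothetical audience segments; real systems would use far richer features.
SEGMENTS = ["sneakerheads", "home cooks", "gamers"]

# How strongly the brand wants to reach each segment (invented numbers).
brand_target = [0.9, 0.1, 0.4]

# Each influencer's audience affinity per segment (invented numbers).
influencers = {
    "alex": [0.8, 0.2, 0.5],
    "sam":  [0.1, 0.9, 0.2],
}

def cosine(u, v):
    """Cosine similarity between two affinity vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    return dot / (math.hypot(*u) * math.hypot(*v))

# Pick the influencer whose audience best matches the brand's target.
best = max(influencers, key=lambda name: cosine(brand_target, influencers[name]))
print("Best match:", best)
```

The point of the sketch: once audiences are reduced to numeric micro-segment profiles, "who should influence whom" becomes a ranking problem, which is what makes the dating-app scenario above so plausible and so worrying.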
Who will influence the influencers?
Everything is happening on Instagram.
There's a story on L2 called "Can Nike Keep Snapchat Alive?"
Nike was the first company to sell directly on Snapchat, and that collaboration signals that Snapchat might be moving into e-commerce.
But in the same week, Kylie Jenner tweeted about Snapchat, coinciding with a roughly $1.3 billion dip in Snap's market valuation.
Even when a platform is doing interesting things in retail, when influencers are doing things in retail and have the products to engage one-to-one, even then it comes down to a handful of people having the eyeballs to determine the fate of those platforms.
So there are influencers for the influencers.
Success will come down to whether you can keep the attention of the people who matter.
And no amount of AI can keep the attention of capricious people.
Ad Age recently covered data showing that micro-influencers are having an outsized effect on people, above and beyond the standard celebrity influencers.
If you're a brand, you probably don't want a big celebrity, you probably want a series of micro-influencers.
Instagram influencers wouldn't traditionally have any corporate sponsorship, but they do because they have millions and millions of eyeballs.
It's only because of their engagement in social. It has nothing to do with any accolade or aptitude.
15-20 years ago you'd have to be an athlete or actor to gain it.
Now anybody can do it for just about anything for anybody.
Or we can fake you with AI.
Back to Body Data!
Shoutout to Shapescale.com: a 3D body-scanning tool for fitness tracking and visualization. You stand on their scale, it scans your body, and you get a 3D picture of yourself that lets you visualize different things, like how your body would change if you pursued a given goal.
It looks at fat and muscle mass, and you get heat maps of where your body's changing, and you get visual goal tracking.
It's marketed as the next generation of scales, beyond the "smart" scales we have now.
It's "cool," but we have to wonder: where does it go from here, other than being cool?
Perhaps you can mine the data and do your own A/B tests on your body?
What it does do is allow someone to attack weight loss or health like a business problem: treat their life like something they can test and experiment on.
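As a rough sketch of what "A/B testing your body" might look like, assuming you could export daily readings from a scale like this (the numbers and the comparison method here are invented for illustration):

```python
from statistics import mean

# Hypothetical daily weight readings (kg), one week on each regimen;
# the values are invented for illustration.
regimen_a = [82.0, 81.8, 81.9, 81.6, 81.5, 81.4, 81.2]  # week on plan A
regimen_b = [81.2, 81.0, 80.6, 80.3, 80.1, 79.8, 79.5]  # week on plan B

def avg_daily_change(readings):
    """Average day-over-day change across a series of readings."""
    deltas = [later - earlier for earlier, later in zip(readings, readings[1:])]
    return mean(deltas)

a_rate = avg_daily_change(regimen_a)
b_rate = avg_daily_change(regimen_b)
print(f"Plan A: {a_rate:+.2f} kg/day, Plan B: {b_rate:+.2f} kg/day")
# For weight loss, the more negative rate "wins" the A/B test.
print("Winner:", "B" if b_rate < a_rate else "A")
```

That's the "business problem" framing in miniature: pick a metric, change one variable, compare the two periods, keep the winner.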
Despite this wealth of technology and data, we're more depressed than we've ever been as a country.
Maybe it's not actually helping us.
That concludes our awesome, meandering, tangential show, and we'd love to hear what you have to say. Go to futurecommerce.fm, hit us up, and lend more to the conversation. Or email us at firstname.lastname@example.org and email@example.com