<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:media="http://search.yahoo.com/mrss/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
    <channel>
        <itunes:owner>
            <itunes:name>Video Archive – The Conference by Media Evolution</itunes:name>
            <itunes:email>theconference@mediaevolution.se</itunes:email>
        </itunes:owner>
        <title>Video Archive – The Conference by Media Evolution</title>
        <link>https://videos.theconference.se</link>
        <description>Media Evolution is a membership organization that helps media industries to innovate and grow.

The videos in this podcast are generated at our annual conference The Conference and lectures we arrange throughout the year.

http://www.mediaevolution.se</description>
        <language>en-us</language>
        <generator>Visualplatform</generator>
        <docs>http://blogs.law.harvard.edu/tech/rss</docs>
        <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
        <itunes:subtitle>Videos generated by Media Evolution</itunes:subtitle>
        <itunes:summary>Media Evolution is a membership organization that helps media industries to innovate and grow.

The videos in this podcast are generated at our annual conference The Conference and lectures we arrange throughout the year.

http://www.mediaevolution.se</itunes:summary>
        <itunes:keywords>media, music, games, publishing, future, social, tv, film</itunes:keywords>
        <itunes:type>episodic</itunes:type>
        <itunes:explicit>no</itunes:explicit>
        <itunes:image href="https://videos.theconference.se/files/rv0.0/sitelogo.gif"/>
        <image>
            <url>https://videos.theconference.se/files/rv0.0/sitelogo.gif</url>
            <title>Video Archive – The Conference by Media Evolution</title>
            <link>https://videos.theconference.se</link>
        </image>
        <atom:link rel="self" href="https://videos.theconference.se/rss/uploaded/2023/08/31"/>
        <atom:link rel="next" href="https://videos.theconference.se/rss/uploaded/2023/08/31?p=2&amp;year=2023&amp;podcast%5fp=f&amp;day=31&amp;month=08&amp;datemode=uploaded&amp;https="/>
        <item>
            <enclosure url="http://videos.theconference.se/64968568/88204385/11a29bd69d01148878ae0ec440806a63/video_medium/qa-creative-assemblages-video.mp4?source=podcast" type="video/mp4" length="41897700"/>
            <title>Q&amp;A – Creative Assemblages</title>
            <link>http://videos.theconference.se/qa-creative-assemblages</link>
            <description>&lt;p&gt;Q&amp;amp;A from the session Creative Assemblages – Emerging Alliances to Augment Human Creativity with&amp;nbsp;Kristoffer Ørum (Artist) and&amp;nbsp;Kader Bagli (RISE | Visual Effects Studio)&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/qa-creative-assemblages"&gt;&lt;img src="http://videos.theconference.se/64968568/88204385/11a29bd69d01148878ae0ec440806a63/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88204385</guid>
            <pubDate>Fri, 01 Sep 2023 17:14:04 GMT</pubDate>
            <media:title>Q&amp;A – Creative Assemblages</media:title>
            <itunes:summary>Q&amp;A from the session Creative Assemblages – Emerging Alliances to Augment Human Creativity with Kristoffer Ørum (Artist) and Kader Bagli (RISE | Visual Effects Studio)</itunes:summary>
            <itunes:subtitle>Q&amp;A from the session Creative Assemblages – Emerging Alliances to Augment Human Creativity with Kristoffer Ørum (Artist) and Kader Bagli (RISE | Visual Effects Studio)</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>14:27</itunes:duration>
            <media:description type="html">&lt;p&gt;Q&amp;amp;A from the session Creative Assemblages – Emerging Alliances to Augment Human Creativity with&amp;nbsp;Kristoffer Ørum (Artist) and&amp;nbsp;Kader Bagli (RISE | Visual Effects Studio)&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/qa-creative-assemblages"&gt;&lt;img src="http://videos.theconference.se/64968568/88204385/11a29bd69d01148878ae0ec440806a63/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=11a29bd69d01148878ae0ec440806a63&amp;source=podcast&amp;photo%5fid=88204385" width="625" height="352" type="text/html" medium="video" duration="867" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968568/88204385/11a29bd69d01148878ae0ec440806a63/standard/download-8-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968568/88204385/11a29bd69d01148878ae0ec440806a63/standard/download-8-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>creative assemblages</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968568/88203915/52e292ee0e5f83402df3b95dc9bf6aa9/video_medium/kristoffer-orum-even-images-that-video.mp4?source=podcast" type="video/mp4" length="53093739"/>
            <title>Kristoffer Ørum – Even Images That You Know To Be False Affect You</title>
            <link>http://videos.theconference.se/kristoffer-orum-even-images-that</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;Kristoffer Ørum finds satisfaction in taking a particular technology that is meant for something else and misusing it. In his current unfinished Instagram project that unfolds over time, the artist uses AI to create a version of history. The generated pictures combine 90s hip-hop culture, the Danish worker movement, fishermen's culture, and the health care system. But the more you look at it, the more you notice its imperfections. That is Kristoffer’s way of imagining and reimagining the past. Why? Because any discussion of the past is a discussion of the future.&lt;/p&gt;&lt;p&gt;We need to rediscover our own imagination. Kristoffer criticises the lack of imaginary futures and our incapability to get out of the future narratives dictated by tech giants. Also, the “horrible shitshow” that is social media, where we constantly get bombarded with fake and manipulated images, contributes to that problem. So he hopes that with projects like his that play with past, future, and (re)imagination, we can regain the power to imagine futures not defined by tech monopolies.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/kristoffer-orum-even-images-that"&gt;&lt;img src="http://videos.theconference.se/64968568/88203915/52e292ee0e5f83402df3b95dc9bf6aa9/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88203915</guid>
            <pubDate>Fri, 01 Sep 2023 17:13:49 GMT</pubDate>
            <media:title>Kristoffer Ørum – Even Images That You Know To Be False Affect You</media:title>
            <itunes:summary>Kristoffer Ørum finds satisfaction in taking a particular technology that is meant for something else and misusing it. In his current unfinished Instagram project that unfolds over time, the artist uses AI to create a version of history. The generated pictures combine 90s hip-hop culture, the Danish worker movement, fishermen's culture, and the health care system. But the more you look at it, the more you notice its imperfections. That is Kristoffer’s way of imagining and reimagining the past. Why? Because any discussion of the past is a discussion of the future.

We need to rediscover our own imagination. Kristoffer criticises the lack of imaginary futures and our incapability to get out of the future narratives dictated by tech giants. Also, the “horrible shitshow” that is social media, where we constantly get bombarded with fake and manipulated images, contributes to that problem. So he hopes that with projects like his that play with past, future, and (re)imagination, we can regain the power to imagine futures not defined by tech monopolies.</itunes:summary>
            <itunes:subtitle>Kristoffer Ørum finds satisfaction in taking a particular technology that is meant for something else and misusing it. In his current unfinished Instagram project that unfolds over time, the artist uses AI to create a version of history. The...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>18:04</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;p&gt;Kristoffer Ørum finds satisfaction in taking a particular technology that is meant for something else and misusing it. In his current unfinished Instagram project that unfolds over time, the artist uses AI to create a version of history. The generated pictures combine 90s hip-hop culture, the Danish worker movement, fishermen's culture, and the health care system. But the more you look at it, the more you notice its imperfections. That is Kristoffer’s way of imagining and reimagining the past. Why? Because any discussion of the past is a discussion of the future.&lt;/p&gt;&lt;p&gt;We need to rediscover our own imagination. Kristoffer criticises the lack of imaginary futures and our incapability to get out of the future narratives dictated by tech giants. Also, the “horrible shitshow” that is social media, where we constantly get bombarded with fake and manipulated images, contributes to that problem. So he hopes that with projects like his that play with past, future, and (re)imagination, we can regain the power to imagine futures not defined by tech monopolies.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/kristoffer-orum-even-images-that"&gt;&lt;img src="http://videos.theconference.se/64968568/88203915/52e292ee0e5f83402df3b95dc9bf6aa9/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=52e292ee0e5f83402df3b95dc9bf6aa9&amp;source=podcast&amp;photo%5fid=88203915" width="625" height="352" type="text/html" medium="video" duration="1084" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968568/88203915/52e292ee0e5f83402df3b95dc9bf6aa9/standard/download-8-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968568/88203915/52e292ee0e5f83402df3b95dc9bf6aa9/standard/download-8-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>creative assemblages</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968578/88203715/263fb172ba081715360d29c95d7bf73a/video_medium/kader-bagli-bringing-nuanced-video.mp4?source=podcast" type="video/mp4" length="51570606"/>
            <title>Kader Bagli – Bringing Nuanced Futures to Life</title>
            <link>http://videos.theconference.se/kader-bagli-bringing-nuanced</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;b&gt;“We all deserve to be seen in visual languages. We already have the tool, now it’s about taking action”&lt;/b&gt;&lt;span&gt;&lt;br&gt;&lt;br&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;Could merging VFX and AI create a fully represented world? Kader Bagli certainly thinks so. Working at the intersection of creativity, imagination and technology, she is exposed to a lack of diversity on a daily basis. Our current visual storytelling landscape doesn’t celebrate cultural diversity across all layers of society – we are good at marginalising groups instead of normalizing them. But the good news is: we already have the tools to change that; we just need to put it into action to unstuck ourselves.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;br&gt;&lt;/p&gt;&lt;p&gt;With AI putting a halt to our lack of imagination and VFX offering storytellers new perspectives, the future looks hopeful. Because we all deserve to be seen — on screen and off — with dignity, humanity and no judgment. So, what are we waiting for if that’s what it takes to change our visual language to a more inclusive one?&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/kader-bagli-bringing-nuanced"&gt;&lt;img src="http://videos.theconference.se/64968578/88203715/263fb172ba081715360d29c95d7bf73a/standard/download-6-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88203715</guid>
            <pubDate>Fri, 01 Sep 2023 17:13:36 GMT</pubDate>
            <media:title>Kader Bagli – Bringing Nuanced Futures to Life</media:title>
            <itunes:summary>“We all deserve to be seen in visual languages. We already have the tool, now it’s about taking action”

Could merging VFX and AI create a fully represented world? Kader Bagli certainly thinks so. Working at the intersection of creativity, imagination and technology, she is exposed to a lack of diversity on a daily basis. Our current visual storytelling landscape doesn’t celebrate cultural diversity across all layers of society – we are good at marginalising groups instead of normalizing them. But the good news is: we already have the tools to change that; we just need to put it into action to unstuck ourselves.

With AI putting a halt to our lack of imagination and VFX offering storytellers new perspectives, the future looks hopeful. Because we all deserve to be seen — on screen and off — with dignity, humanity and no judgment. So, what are we waiting for if that’s what it takes to change our visual language to a more inclusive one?</itunes:summary>
            <itunes:subtitle>“We all deserve to be seen in visual languages. We already have the tool, now it’s about taking action”Could merging VFX and AI create a fully represented world? Kader Bagli certainly thinks so. Working at the intersection of creativity,...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>17:45</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;b&gt;“We all deserve to be seen in visual languages. We already have the tool, now it’s about taking action”&lt;/b&gt;&lt;span&gt;&lt;br&gt;&lt;br&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;Could merging VFX and AI create a fully represented world? Kader Bagli certainly thinks so. Working at the intersection of creativity, imagination and technology, she is exposed to a lack of diversity on a daily basis. Our current visual storytelling landscape doesn’t celebrate cultural diversity across all layers of society – we are good at marginalising groups instead of normalizing them. But the good news is: we already have the tools to change that; we just need to put it into action to unstuck ourselves.&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;br&gt;&lt;/p&gt;&lt;p&gt;With AI putting a halt to our lack of imagination and VFX offering storytellers new perspectives, the future looks hopeful. Because we all deserve to be seen — on screen and off — with dignity, humanity and no judgment. So, what are we waiting for if that’s what it takes to change our visual language to a more inclusive one?&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/kader-bagli-bringing-nuanced"&gt;&lt;img src="http://videos.theconference.se/64968578/88203715/263fb172ba081715360d29c95d7bf73a/standard/download-6-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=263fb172ba081715360d29c95d7bf73a&amp;source=podcast&amp;photo%5fid=88203715" width="625" height="352" type="text/html" medium="video" duration="1065" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968578/88203715/263fb172ba081715360d29c95d7bf73a/standard/download-6-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968578/88203715/263fb172ba081715360d29c95d7bf73a/standard/download-6-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>creative assemblages</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968567/88204429/abbdbfdd0554cdb8019006e1a1ffa08f/video_medium/qa-humanity-aimplified-video.mp4?source=podcast" type="video/mp4" length="22921714"/>
            <title>Q&amp;A – Humanity, (AI)mplified</title>
            <link>http://videos.theconference.se/qa-humanity-aimplified</link>
            <description>&lt;p&gt;Q&amp;amp;A from the session&amp;nbsp;Humanity, (AI)mplified – The AI Tools We Use to Be More Human with&amp;nbsp;Laura Herman (Oxford Internet Institute), Ovetta Sampson (Google) and Charlotte Högberg (Lund University)&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/qa-humanity-aimplified"&gt;&lt;img src="http://videos.theconference.se/64968567/88204429/abbdbfdd0554cdb8019006e1a1ffa08f/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88204429</guid>
            <pubDate>Fri, 01 Sep 2023 17:11:49 GMT</pubDate>
            <media:title>Q&amp;A – Humanity, (AI)mplified</media:title>
            <itunes:summary>Q&amp;A from the session Humanity, (AI)mplified – The AI Tools We Use to Be More Human with Laura Herman (Oxford Internet Institute), Ovetta Sampson (Google) and Charlotte Högberg (Lund University)</itunes:summary>
            <itunes:subtitle>Q&amp;A from the session Humanity, (AI)mplified – The AI Tools We Use to Be More Human with Laura Herman (Oxford Internet Institute), Ovetta Sampson (Google) and Charlotte Högberg (Lund University)</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>06:09</itunes:duration>
            <media:description type="html">&lt;p&gt;Q&amp;amp;A from the session&amp;nbsp;Humanity, (AI)mplified – The AI Tools We Use to Be More Human with&amp;nbsp;Laura Herman (Oxford Internet Institute,)
Ovetta Sampson (Google) and Charlotte Högberg (Lund University)&amp;nbsp;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/qa-humanity-aimplified"&gt;&lt;img src="http://videos.theconference.se/64968567/88204429/abbdbfdd0554cdb8019006e1a1ffa08f/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=abbdbfdd0554cdb8019006e1a1ffa08f&amp;source=podcast&amp;photo%5fid=88204429" width="625" height="352" type="text/html" medium="video" duration="369" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968567/88204429/abbdbfdd0554cdb8019006e1a1ffa08f/standard/download-8-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968567/88204429/abbdbfdd0554cdb8019006e1a1ffa08f/standard/download-8-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>ai</category>
            <category>humanity (ai)mplified</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968566/88203554/c2bc5d032669fb4fbfc812fd5812ef78/video_medium/charlotte-hogberg-the-doctor-is-video.mp4?source=podcast" type="video/mp4" length="37170256"/>
            <title>Charlotte Högberg – The Doctor is In: Assistive Intelligence in Healthcare</title>
            <link>http://videos.theconference.se/charlotte-hogberg-the-doctor-is</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;How can AI be used for the common good or more precisely in healthcare? Charlotte’s work explores what futures are possible and desirable but also what we are at risk of losing. AI's role in healthcare can have harmful effects but also huge potential benefits. She emphasises the need to raise vital questions and consider consequences. Awareness of nuances, understanding risks and avoiding unethical technology is key because this touches upon high stake decisions (literally about life and death).&amp;nbsp;&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;&lt;b&gt;“We need to be in the often uncomfortable spaces in between - at least for a while”&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;Constant critical engagement and trust calibration is needed because there are no quick fixes for these new emerging issues - for example how much can we augment until humans are only operators - and is this something we want to strive towards?&lt;/p&gt;&lt;br&gt;&lt;p&gt;She closes her talk with a both hopeful and cautious remark. She advocates daring to dream of AI-specialist collaboration, while maintaining agency, responsible exchange and upholding medical ethics and common goals.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/charlotte-hogberg-the-doctor-is"&gt;&lt;img src="http://videos.theconference.se/64968566/88203554/c2bc5d032669fb4fbfc812fd5812ef78/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88203554</guid>
            <pubDate>Fri, 01 Sep 2023 17:11:28 GMT</pubDate>
            <media:title>Charlotte Högberg – The Doctor is In: Assistive Intelligence in Healthcare</media:title>
            <itunes:summary>How can AI be used for the common good or more precisely in healthcare? Charlotte’s work explores what futures are possible and desirable but also what we are at risk of losing. AI's role in healthcare can have harmful effects but also huge potential benefits. She emphasises the need to raise vital questions and consider consequences. Awareness of nuances, understanding risks and avoiding unethical technology is key because this touches upon high stake decisions (literally about life and death).

“We need to be in the often uncomfortable spaces in between - at least for a while”

Constant critical engagement and trust calibration is needed because there are no quick fixes for these new emerging issues - for example how much can we augment until humans are only operators - and is this something we want to strive towards?

She closes her talk with a both hopeful and cautious remark. She advocates daring to dream of AI-specialist collaboration, while maintaining agency, responsible exchange and upholding medical ethics and common goals.</itunes:summary>
            <itunes:subtitle>How can AI be used for the common good or more precisely in healthcare? Charlotte’s work explores what futures are possible and desirable but also what we are at risk of losing. AI's role in healthcare can have harmful effects but also huge...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>13:52</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;p&gt;How can AI be used for the common good or more precisely in healthcare? Charlotte’s work explores what futures are possible and desirable but also what we are at risk of losing. AI's role in healthcare can have harmful effects but also huge potential benefits. She emphasises the need to raise vital questions and consider consequences. Awareness of nuances, understanding risks and avoiding unethical technology is key because this touches upon high stake decisions (literally about life and death).&amp;nbsp;&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;&lt;b&gt;“We need to be in the often uncomfortable spaces in between - at least for a while”&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;Constant critical engagement and trust calibration is needed because there are no quick fixes for these new emerging issues - for example how much can we augment until humans are only operators - and is this something we want to strive towards?&lt;/p&gt;&lt;br&gt;&lt;p&gt;She closes her talk with a both hopeful and cautious remark. She advocates daring to dream of AI-specialist collaboration, while maintaining agency, responsible exchange and upholding medical ethics and common goals.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/charlotte-hogberg-the-doctor-is"&gt;&lt;img src="http://videos.theconference.se/64968566/88203554/c2bc5d032669fb4fbfc812fd5812ef78/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=c2bc5d032669fb4fbfc812fd5812ef78&amp;source=podcast&amp;photo%5fid=88203554" width="625" height="352" type="text/html" medium="video" duration="832" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968566/88203554/c2bc5d032669fb4fbfc812fd5812ef78/standard/download-7-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968566/88203554/c2bc5d032669fb4fbfc812fd5812ef78/standard/download-7-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>ai</category>
            <category>humanity (ai)mplified</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968578/88204257/875adba576b4286d67aacd05611ed19a/video_medium/ovetta-sampson-design-principles-video.mp4?source=podcast" type="video/mp4" length="78619888"/>
            <title>Ovetta Sampson – Design Principles for a Pluralist Automated Future</title>
            <link>http://videos.theconference.se/ovetta-sampson-design-principles</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;b&gt;"Data is the love language of machine learning, but we must remember that it is not true."&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;We all create data. And all data is created by people. Ovetta Sampson wants us to remember this, both in order to centre humanity but also to clarify the vulnerabilities of data. We are biased, so the data we create is infused with biases as well. Whether it is by the sin of omission or the use of inequitable variables, traumatised datasets manifest in real world situations such as applying for a bank loan or decisions made on housing and education.&lt;/p&gt;&lt;br&gt;&lt;p&gt;Ovetta urges particular caution for the encounters between humans and machines in the era of AI and machine learning. It's not Skynet, not yet, but ceding decision making responsibility to such systems can lead to harmful consequences. The best way of countering these resides in responsible, human-centred design frameworks which capture the minimum viable data, maintain balance in the exchange of what people give and what they receive, and include iterative privacy by default.&lt;/p&gt;&lt;br&gt;&lt;p&gt;Ovetta ends with a rigorous set of responsible design practices to combat the amplification of our human biases by AI systems.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/ovetta-sampson-design-principles"&gt;&lt;img src="http://videos.theconference.se/64968578/88204257/875adba576b4286d67aacd05611ed19a/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88204257</guid>
            <pubDate>Fri, 01 Sep 2023 17:11:12 GMT</pubDate>
            <media:title>Ovetta Sampson – Design Principles for a Pluralist Automated Future</media:title>
            <itunes:summary>"Data is the love language of machine learning, but we must remember that it is not true."

We all create data. And all data is created by people. Ovetta Sampson wants us to remember this, both in order to centre humanity but also to clarify the vulnerabilities of data. We are biased, so the data we create is infused with biases as well. Whether it is by the sin of omission or the use of inequitable variables, traumatised datasets manifest in real world situations such as applying for a bank loan or decisions made on housing and education.

Ovetta urges particular caution for the encounters between humans and machines in the era of AI and machine learning. It's not Skynet, not yet, but ceding decision making responsibility to such systems can lead to harmful consequences. The best way of countering these resides in responsible, human-centred design frameworks which capture the minimum viable data, maintain balance in the exchange of what people give and what they receive, and include iterative privacy by default.

Ovetta ends with a rigorous set of responsible design practices to combat the amplification of our human biases by AI systems.</itunes:summary>
            <itunes:subtitle>"Data is the love language of machine learning, but we must remember that it is not true."We all create data. And all data is created by people. Ovetta Sampson wants us to remember this, both in order to centre humanity but also to clarify the...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>19:16</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;b&gt;"Data is the love language of machine learning, but we must remember that it is not true."&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;We all create data. And all data is created by people. Ovetta Sampson wants us to remember this, both in order to centre humanity but also to clarify the vulnerabilities of data. We are biased, so the data we create is infused with biases as well. Whether it is by the sin of omission or the use of inequitable variables, traumatised datasets manifest in real world situations such as applying for a bank loan or decisions made on housing and education.&lt;/p&gt;&lt;br&gt;&lt;p&gt;Ovetta urges particular caution for the encounters between humans and machines in the era of AI and machine learning. It's not Skynet, not yet, but ceding decision making responsibility to such systems can lead to harmful consequences. The best way of countering these resides in responsible, human-centred design frameworks which capture the minimum viable data, maintain balance in the exchange of what people give and what they receive, and include iterative privacy by default.&lt;/p&gt;&lt;br&gt;&lt;p&gt;Ovetta ends with a rigorous set of responsible design practices to combat the amplification of our human biases by AI systems.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/ovetta-sampson-design-principles"&gt;&lt;img src="http://videos.theconference.se/64968578/88204257/875adba576b4286d67aacd05611ed19a/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=875adba576b4286d67aacd05611ed19a&amp;source=podcast&amp;photo%5fid=88204257" width="625" height="352" type="text/html" medium="video" duration="1156" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968578/88204257/875adba576b4286d67aacd05611ed19a/standard/download-8-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968578/88204257/875adba576b4286d67aacd05611ed19a/standard/download-8-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>ai</category>
            <category>humanity (ai)mplified</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968575/88204003/33c2e83c3545b59455a472a84e31878d/video_medium/laura-herman-the-human-creative-video.mp4?source=podcast" type="video/mp4" length="41906795"/>
            <title>Laura Herman - The Human Creative Director: Remixing, Seeing, Curating</title>
            <link>http://videos.theconference.se/laura-herman-the-human-creative</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;b&gt;“What would you make if you did not have to generate it?”&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;Global platforms like Instagram and TikTok enable creatives to show their work to wider audiences. However, these platforms also operate using algorithms that determine which content appears in users' feeds and what remains unseen.&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;This dynamic has significant implications: firstly, algorithms replace the work that elite institutions such as museums and art galleries have traditionally done. We tend to explore what grabs our attention instead of spending time with art pieces that might need more time to examine and understand. Secondly, recommendation systems influence the way creatives show their work. Collaborators in Laura’s projects shared that they sometimes publish works that will perform well but not necessarily challenge them as artists or show their best work.&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;&lt;b&gt;“How would you describe your taste to a machine?”&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;This has an influence on our perception of the world and on the evolving role of humans as creators. Laura believes generative AI tools can be seen as an additional tool in the creatives’ toolbox. Along with this, there is a shift of focus from production to curation.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Coming up with ideas and thoughtful concepts is becoming more crucial than the production process itself, as this can be (partially or entirely) assisted by generative AI tools. Taste and critical selection criteria will become sought-after creative skills. She closes her talk with provocative questions, such as how art pieces might look different depending on whom they were created for.&lt;/p&gt;&lt;div&gt;&lt;br&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/laura-herman-the-human-creative"&gt;&lt;img src="http://videos.theconference.se/64968575/88204003/33c2e83c3545b59455a472a84e31878d/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88204003</guid>
            <pubDate>Fri, 01 Sep 2023 17:10:50 GMT</pubDate>
            <media:title>Laura Herman - The Human Creative Director: Remixing, Seeing, Curating</media:title>
            <itunes:summary>“What would you make if you did not have to generate it?” Global platforms like Instagram and TikTok enable creatives to show their work to wider audiences. However, these platforms also operate using algorithms that determine which content appears in users' feeds and what remains unseen. This dynamic has significant implications: firstly, algorithms replace the work that elite institutions such as museums and art galleries have traditionally done. We tend to explore what grabs our attention instead of spending time with art pieces that might need more time to examine and understand. Secondly, recommendation systems influence the way creatives show their work. Collaborators in Laura’s projects shared that they sometimes publish works that will perform well but not necessarily challenge them as artists or show their best work. “How would you describe your taste to a machine?” This has an influence on our perception of the world and on the evolving role of humans as creators. Laura believes generative AI tools can be seen as an additional tool in the creatives’ toolbox. Along with this, there is a shift of focus from production to curation. Coming up with ideas and thoughtful concepts is becoming more crucial than the production process itself, as this can be (partially or entirely) assisted by generative AI tools. Taste and critical selection criteria will become sought-after creative skills. She closes her talk with provocative questions, such as how art pieces might look different depending on whom they were created for.</itunes:summary>
            <itunes:subtitle>“What would you make if you did not have to generate it?” Global platforms like Instagram and TikTok enable creatives to show their work to wider audiences. However, these platforms also operate using algorithms that determine which content...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>15:21</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;b&gt;“What would you make if you did not have to generate it?”&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;Global platforms like Instagram and TikTok enable creatives to show their work to wider audiences. However, these platforms also operate using algorithms that determine which content appears in users' feeds and what remains unseen.&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;This dynamic has significant implications: firstly, algorithms replace the work that elite institutions such as museums and art galleries have traditionally done. We tend to explore what grabs our attention instead of spending time with art pieces that might need more time to examine and understand. Secondly, recommendation systems influence the way creatives show their work. Collaborators in Laura’s projects shared that they sometimes publish works that will perform well but not necessarily challenge them as artists or show their best work.&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;&lt;b&gt;“How would you describe your taste to a machine?”&lt;/b&gt;&lt;/p&gt;&lt;br&gt;&lt;p&gt;This has an influence on our perception of the world and on the evolving role of humans as creators. Laura believes generative AI tools can be seen as an additional tool in the creatives’ toolbox. Along with this, there is a shift of focus from production to curation.&amp;nbsp;&lt;/p&gt;&lt;p&gt;Coming up with ideas and thoughtful concepts is becoming more crucial than the production process itself, as this can be (partially or entirely) assisted by generative AI tools. Taste and critical selection criteria will become sought-after creative skills. She closes her talk with provocative questions, such as how art pieces might look different depending on whom they were created for.&lt;/p&gt;&lt;div&gt;&lt;br&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/laura-herman-the-human-creative"&gt;&lt;img src="http://videos.theconference.se/64968575/88204003/33c2e83c3545b59455a472a84e31878d/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=33c2e83c3545b59455a472a84e31878d&amp;source=podcast&amp;photo%5fid=88204003" width="625" height="352" type="text/html" medium="video" duration="921" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968575/88204003/33c2e83c3545b59455a472a84e31878d/standard/download-7-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968575/88204003/33c2e83c3545b59455a472a84e31878d/standard/download-7-thumbnail.jpg/thumbnail.jpg"/>
            <category>ai</category>
            <category>humanity (ai)mplified</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968576/88204564/92fbe3e076d1cc65639ff7f730a862d9/video_medium/qa-memory-in-the-machine-video.mp4?source=podcast" type="video/mp4" length="54709869"/>
            <title>Q&amp;A – Memory in The Machine</title>
            <link>http://videos.theconference.se/qa-memory-in-the-machine</link>
            <description>&lt;p&gt;Q&amp;amp;A from the session Memory in The Machine – The Tools We Use to Archive Us with&amp;nbsp;Carl Öhman (Uppsala University) and Neef Rehman (Creative technologist, Isometric)&amp;nbsp;&lt;div&gt;&lt;br&gt;&lt;/div&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/qa-memory-in-the-machine"&gt;&lt;img src="http://videos.theconference.se/64968576/88204564/92fbe3e076d1cc65639ff7f730a862d9/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88204564</guid>
            <pubDate>Fri, 01 Sep 2023 17:06:21 GMT</pubDate>
            <media:title>Q&amp;A – Memory in The Machine</media:title>
            <itunes:summary>Q&amp;A from the session Memory in The Machine – The Tools We Use to Archive Us with Carl Öhman (Uppsala University) and Neef Rehman (Creative technologist, Isometric)</itunes:summary>
            <itunes:subtitle>Q&amp;A from the session Memory in The Machine – The Tools We Use to Archive Us with Carl Öhman (Uppsala University) and Neef Rehman (Creative technologist, Isometric)</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>18:16</itunes:duration>
            <media:description type="html">&lt;p&gt;Q&amp;amp;A from the session Memory in The Machine – The Tools We Use to Archive Us with&amp;nbsp;Carl Öhman (Uppsala University) and Neef Rehman (Creative technologist, Isometric)&amp;nbsp;&lt;div&gt;&lt;br&gt;&lt;/div&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/qa-memory-in-the-machine"&gt;&lt;img src="http://videos.theconference.se/64968576/88204564/92fbe3e076d1cc65639ff7f730a862d9/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=92fbe3e076d1cc65639ff7f730a862d9&amp;source=podcast&amp;photo%5fid=88204564" width="625" height="352" type="text/html" medium="video" duration="1096" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968576/88204564/92fbe3e076d1cc65639ff7f730a862d9/standard/download-7-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968576/88204564/92fbe3e076d1cc65639ff7f730a862d9/standard/download-7-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>memory in the machine</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968570/88204086/ac742c32d796cb51af676e496708ddf8/video_medium/neef-rehman-machine-forgetting-video.mp4?source=podcast" type="video/mp4" length="47595798"/>
            <title>Neef Rehman - Machine Forgetting: Memory as Instruction &amp; the Fallacy of Time</title>
            <link>http://videos.theconference.se/neef-rehman-machine-forgetting</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;b&gt;“The anthropomorphisation of AI is not the way to go. It is more interesting to look at how that impacts our interaction and perception of time.”&lt;/b&gt;&lt;br&gt;&lt;br&gt;&lt;/span&gt;&lt;span&gt;&lt;p&gt;What does time look like for machines? Do machines understand time the way humans perceive it? And what happens when we rely on machines that have their own view of the world and of us? Neef Rehman discusses how the concept of time is unique to us and shaped by the people around us. But in an era of increasing interdependence, time as we know it — fluid, fallible and social — is challenged by generative, unreliable agents that blur the past and the future.&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;Machine Forgetting is a phenomenon that we need to remind ourselves more of, argues Neef. Why? As we rely on our extended network of technology and learned machines to remind us of memories or events, we must accept that these agents are not as shiny as they present themselves. They are messy, biased and will forget things — just like humans. So let’s not spend time feeling gaslit by these agents but radically accept that no one has certainty over memory.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/neef-rehman-machine-forgetting"&gt;&lt;img src="http://videos.theconference.se/64968570/88204086/ac742c32d796cb51af676e496708ddf8/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88204086</guid>
            <pubDate>Fri, 01 Sep 2023 17:06:10 GMT</pubDate>
            <media:title>Neef Rehman - Machine Forgetting: Memory as Instruction &amp; the Fallacy of Time</media:title>
            <itunes:summary>“The anthropomorphisation of AI is not the way to go. It is more interesting to look at how that impacts our interaction and perception of time.” What does time look like for machines? Do machines understand time the way humans perceive it? And what happens when we rely on machines that have their own view of the world and of us? Neef Rehman discusses how the concept of time is unique to us and shaped by the people around us. But in an era of increasing interdependence, time as we know it — fluid, fallible and social — is challenged by generative, unreliable agents that blur the past and the future. Machine Forgetting is a phenomenon that we need to remind ourselves more of, argues Neef. Why? As we rely on our extended network of technology and learned machines to remind us of memories or events, we must accept that these agents are not as shiny as they present themselves. They are messy, biased and will forget things — just like humans. So let’s not spend time feeling gaslit by these agents but radically accept that no one has certainty over memory.</itunes:summary>
            <itunes:subtitle>“The anthropomorphisation of AI is not the way to go. It is more interesting to look at how that impacts our interaction and perception of time.” What does time look like for machines? Do machines understand time the way humans perceive it? And...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>16:16</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;b&gt;“The anthropomorphisation of AI is not the way to go. It is more interesting to look at how that impacts our interaction and perception of time.”&lt;/b&gt;&lt;br&gt;&lt;br&gt;&lt;/span&gt;&lt;span&gt;&lt;p&gt;What does time look like for machines? Do machines understand time the way humans perceive it? And what happens when we rely on machines that have their own view of the world and of us? Neef Rehman discusses how the concept of time is unique to us and shaped by the people around us. But in an era of increasing interdependence, time as we know it — fluid, fallible and social — is challenged by generative, unreliable agents that blur the past and the future.&amp;nbsp;&lt;/p&gt;&lt;br&gt;&lt;p&gt;Machine Forgetting is a phenomenon that we need to remind ourselves more of, argues Neef. Why? As we rely on our extended network of technology and learned machines to remind us of memories or events, we must accept that these agents are not as shiny as they present themselves. They are messy, biased and will forget things — just like humans. So let’s not spend time feeling gaslit by these agents but radically accept that no one has certainty over memory.&lt;/p&gt;&lt;div&gt;&lt;span&gt;&lt;br&gt;&lt;/span&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/neef-rehman-machine-forgetting"&gt;&lt;img src="http://videos.theconference.se/64968570/88204086/ac742c32d796cb51af676e496708ddf8/standard/download-7-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=ac742c32d796cb51af676e496708ddf8&amp;source=podcast&amp;photo%5fid=88204086" width="625" height="352" type="text/html" medium="video" duration="976" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968570/88204086/ac742c32d796cb51af676e496708ddf8/standard/download-7-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968570/88204086/ac742c32d796cb51af676e496708ddf8/standard/download-7-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
            <category>memory in the machine</category>
        </item>
        <item>
            <enclosure url="http://videos.theconference.se/64968579/88203494/ee75115b64c26b297cca60e7d19b071f/video_medium/carl-ohman-the-ethics-of-our-video.mp4?source=podcast" type="video/mp4" length="44829349"/>
            <title>Carl Öhman – The Ethics of Our Digital Afterlives</title>
            <link>http://videos.theconference.se/carl-ohman-the-ethics-of-our</link>
            <description>&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;span&gt;&lt;b&gt;“The data of the dead is more than individual user history, it is the heritage of the 21st century.”&lt;/b&gt;&lt;br&gt;&lt;/span&gt;&lt;br&gt;By the end of this century, Facebook will host 5 billion profiles of deceased people - and will therefore have access to the data of more people who are dead than alive. That poses an urgent question: What do we do with the digital dead?&lt;/p&gt;&lt;p&gt;Since the agricultural revolution and the settlement of humans, the dead have been around us. They are a portal to our past, and we continue to feel connected to them. Due to the digital revolution, the people who die now also leave a digital footprint. That leads to a question of morality: What do we do with this enormous amount of data? Who owns it, who accesses it, who deletes it? Who can profit from it economically? All of this is more than a question of individual user history; it is the actual heritage of the 21st century. So, how do we treat this primary source of information that we pass down to the future?&lt;/p&gt;&lt;p&gt;Should we really let one or two tech companies gatekeep access to our collective digital past? Should we grant families access to our data after we die? And can we influence which version of us will be displayed after our death? Carl Öhman encourages us to think about these questions because even if you think you will not care about your data once you are dead, there are various reasons why you absolutely should.&lt;/p&gt;&lt;div&gt;&lt;br&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/carl-ohman-the-ethics-of-our"&gt;&lt;img src="http://videos.theconference.se/64968579/88203494/ee75115b64c26b297cca60e7d19b071f/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</description>
            <guid>http://videos.theconference.se/photo/88203494</guid>
            <pubDate>Fri, 01 Sep 2023 17:05:56 GMT</pubDate>
            <media:title>Carl Öhman – The Ethics of Our Digital Afterlives</media:title>
            <itunes:summary>“The data of the dead is more than individual user history, it is the heritage of the 21st century.” By the end of this century, Facebook will host 5 billion profiles of deceased people - and will therefore have access to the data of more people who are dead than alive. That poses an urgent question: What do we do with the digital dead? Since the agricultural revolution and the settlement of humans, the dead have been around us. They are a portal to our past, and we continue to feel connected to them. Due to the digital revolution, the people who die now also leave a digital footprint. That leads to a question of morality: What do we do with this enormous amount of data? Who owns it, who accesses it, who deletes it? Who can profit from it economically? All of this is more than a question of individual user history; it is the actual heritage of the 21st century. So, how do we treat this primary source of information that we pass down to the future? Should we really let one or two tech companies gatekeep access to our collective digital past? Should we grant families access to our data after we die? And can we influence which version of us will be displayed after our death? Carl Öhman encourages us to think about these questions because even if you think you will not care about your data once you are dead, there are various reasons why you absolutely should.</itunes:summary>
            <itunes:subtitle>“The data of the dead is more than individual user history, it is the heritage of the 21st century.” By the end of this century, Facebook will host 5 billion profiles of deceased people - and will therefore have access to the data of more people who are...</itunes:subtitle>
            <itunes:author>Video Archive – The Conference by Media Evolution</itunes:author>
            <itunes:duration>16:05</itunes:duration>
            <media:description type="html">&lt;p&gt;&lt;span&gt;&lt;p&gt;&lt;span&gt;&lt;b&gt;“The data of the dead is more than individual user history, it is the heritage of the 21st century.”&lt;/b&gt;&lt;br&gt;&lt;/span&gt;&lt;br&gt;By the end of this century, Facebook will host 5 billion profiles of deceased people - and will therefore have access to the data of more people who are dead than alive. That poses an urgent question: What do we do with the digital dead?&lt;/p&gt;&lt;p&gt;Since the agricultural revolution and the settlement of humans, the dead have been around us. They are a portal to our past, and we continue to feel connected to them. Due to the digital revolution, the people who die now also leave a digital footprint. That leads to a question of morality: What do we do with this enormous amount of data? Who owns it, who accesses it, who deletes it? Who can profit from it economically? All of this is more than a question of individual user history; it is the actual heritage of the 21st century. So, how do we treat this primary source of information that we pass down to the future?&lt;/p&gt;&lt;p&gt;Should we really let one or two tech companies gatekeep access to our collective digital past? Should we grant families access to our data after we die? And can we influence which version of us will be displayed after our death? Carl Öhman encourages us to think about these questions because even if you think you will not care about your data once you are dead, there are various reasons why you absolutely should.&lt;/p&gt;&lt;div&gt;&lt;br&gt;&lt;/div&gt;&lt;/span&gt;&lt;/p&gt;&lt;p&gt;&lt;a href="http://videos.theconference.se/carl-ohman-the-ethics-of-our"&gt;&lt;img src="http://videos.theconference.se/64968579/88203494/ee75115b64c26b297cca60e7d19b071f/standard/download-8-thumbnail.jpg" width="75" height=""/&gt;&lt;/a&gt;&lt;/p&gt;</media:description>
            <media:content url="//videos.theconference.se/v.ihtml/player.html?token=ee75115b64c26b297cca60e7d19b071f&amp;source=podcast&amp;photo%5fid=88203494" width="625" height="352" type="text/html" medium="video" duration="965" isDefault="true" expression="full"/>
            <media:thumbnail url="http://videos.theconference.se/64968579/88203494/ee75115b64c26b297cca60e7d19b071f/standard/download-8-thumbnail.jpg" width="75" height=""/>
            <itunes:image href="http://videos.theconference.se/64968579/88203494/ee75115b64c26b297cca60e7d19b071f/standard/download-8-thumbnail.jpg/thumbnail.jpg"/>
            <category>2023</category>
        </item>
    </channel>
</rss>
