Agnis Stibe
Filtering by: workshop

Human Artificial Intelligence
Nov 14, 9:00 PM


Transformative Technology Conference

You are welcome to join this community breakout session on Human Artificial Intelligence at the annual Transformative Technology Conference.

Tremendous achievements demonstrate our human capacity to evolve and progress. On this journey, let's remember that all the key answers are already encoded in our basic nature. Artificial intelligence, then, is simply something that can serve us in humankind's next big leap. Technology is nothing more than a huge mirror that reflects all the good and the bad sides of what we have inherited as human beings.

Artificial intelligence is learning from what we can offer.

Developer bias is at the core of all our expectations and speculations for the future of man-machine co-evolution.

Regenerative Humanity
Apr 14, 5:00 PM


Visions that Compose a New Reality

We invite you to the fourth week of Wisdom Health, where some of the people who have accompanied us in this first stage will share their visions for composing a new reality. The aim is a cross-linked dialogue that gives birth to a common message, integrating what is happening to us now, what we can learn from the past, and what moves us to commit to a regenerated future.

Wisdom Health is an emerging community of conscious leaders that promotes the relationships sustaining the process of regeneration and cohesion in vulnerable times. It focuses on three main areas:

  • Being with our Present - Regenerative Humanity

  • Learning from our Past - Ancient Wisdom Resilience

  • Committed to our Future - Humane Economy and Technology

Transforming IMBA Hackathon
Dec 13, 9:00 AM


Global Business Challenges

Prof. Agnis Stibe invites his students to the Transforming IMBA Hackathon to address Global Business Challenges.

This was a unique experience: for one full day, all of the Transforming IMBA students addressed some of the most urgent global challenges within these three:

CATEGORIES

  1. Sustainable Business (19) - creating businesses that leave a positive footprint on the planet.

  2. Human Flourishing (6) - helping people and societies lead happy and fulfilling lives.

  3. Emerging Economies (3) - facilitating and supporting emerging markets to succeed faster.

CRITERIA

  1. Impact - how large is the impact? How many people are affected?

  2. Scale - how easily can the solution be scaled? Can it cover the whole planet?

  3. Viability - is there a real need? Is there a validated necessity?

  4. Disruptiveness - how futuristic is the solution? How far does it depart from ordinary thinking?
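For illustration, these four criteria can be combined into a simple scoring rubric, where each judge rates every team on each criterion and teams are ranked by their mean total. This is only a hypothetical sketch; the team names, ratings, and 1-5 scale are assumptions, not the hackathon's actual grading scheme.

```python
# Hypothetical rubric: each judge rates a team 1-5 on the four criteria;
# teams are ranked by their mean total score across judges.
CRITERIA = ("impact", "scale", "viability", "disruptiveness")

def team_score(judge_ratings):
    """Mean total across judges; each rating maps criterion -> score (1-5)."""
    totals = [sum(r[c] for c in CRITERIA) for r in judge_ratings]
    return sum(totals) / len(totals)

# Invented example ratings (one judge per team, for brevity).
ratings = {
    "Team A": [{"impact": 5, "scale": 4, "viability": 4, "disruptiveness": 3}],
    "Team B": [{"impact": 3, "scale": 3, "viability": 5, "disruptiveness": 4}],
}

ranking = sorted(ratings, key=lambda t: team_score(ratings[t]), reverse=True)
print(ranking)  # ['Team A', 'Team B'] (totals 16 vs 15)
```

Averaging totals across judges keeps the rubric robust to any single judge's leniency; weighting the criteria differently would be a straightforward extension.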

TEAMS

  • Registered their team name and members.

  • Had a maximum of 3 people.

  • Could include students from any of the IMBA groups.

SOLUTIONS

  • Recorded as a short video presentation or demonstration.

  • Each video was less than 3 minutes long.

  • Screencast-O-Matic was suggested as an easy video recording and editing tool.

  • All videos were uploaded to YouTube.

  • The names of all team members were explicitly written in the video description on YouTube.

AGENDA

  • 09:00 - Welcome Zoom Call

  • 09:10 - Introduction to the Categories

  • 09:20 - Teams Formation

  • 09:30 - Hackathon Starts

  • 17:00 - Time to Upload your Videos to YouTube

  • 17:30 - All Videos Uploaded and their Links Added

  • 17:40 - Closing Zoom Call

  • 18:00 - Hackathon Ends

GRADING

  • All videos were evaluated and ranked.

  • The best-performing videos received top grades.

ENCOURAGEMENT

  • Get your enthusiasm, creativity, and devices ready for this memorable experience!

  • Any questions?

STATION F Startup Meetup
Jan 30, 6:00 PM


How Transformation Actually Works to Improve Lives and Accelerate Businesses

If you don't know how transformation works, this exclusive event is right for you! 

You will:

  • Apply 8 practical tools from the Transforming Toolset.

  • Practice making business and life-changing transformations.

  • Experience your own personal paradigm shifts.

  • Bring your real-life challenges to solve them right there.

  • See the end of all failed transformations.

  • Finally succeed with your new year's resolutions.

Enjoy your time with the global thought leader on science-driven transformation!

STATION F is the world’s biggest startup campus - the only startup campus gathering a whole entrepreneurial ecosystem under one roof.

Transforming Wellbeing Theory
Sep 20, 1:00 PM


Permanently Shifting Human Behavior and Attitude

In this session, you will learn 8 transforming tools that you will be able to apply immediately to any of your challenges related to human behavior and attitude.

Have you ever tried to change something in yourself or others? How many times have you had a New Year’s resolution that succeeded? Most of us strive for better lives and try our best to achieve change. However, we rarely get to celebrate victories in truly transforming our habits, behaviors, and thoughts, and thus changing our lives and businesses for good.

This session blends science and practice to help participants gain a rich understanding of how transformation works, what its essential components are, how to design and apply influential strategies, which novel technologies (e.g., artificial intelligence, augmented reality, and advanced sensing) effectively facilitate the change process, and what to do first thing each morning.

Most of us already want change for the better. What we often lack is knowing how to make such transformations succeed. This session helps you develop and internalize the necessary competences.

Uncovering Dark Patterns in Persuasive Technology
Apr 17, 2:00 PM


Welcome to the 'Uncovering Dark Patterns in Persuasive Technology' (#DPPT2018) Workshop

Dark patterns are interactive design patterns that influence technology users through deception or trickery, and they represent unethical applications of persuasive technology. However, our ability to identify dark patterns is limited, creating a situation where abuses of persuasive psychology are difficult to manage because they are difficult even to identify. Although there are numerous practitioner taxonomies of dark patterns, no scientifically grounded taxonomy is available.

In this workshop, participants enjoyed an introduction to dark patterns, and an overview of the psychological mechanisms that drive them.

Intention-Outcome matrix adopted from Stibe and Cugelman.

Through participatory exercises, participants helped identify the theoretical underpinnings that drive dark patterns and contributed to the development of a taxonomy of dark patterns based on consensus within the scientific community. In the workshop, we formed working teams that reviewed the dark pattern taxonomy, looking for alternative theoretical explanations. Each working team participated in a group sorting exercise designed to inform the development of a theoretically framed taxonomy of dark patterns. All outputs of the workshop were captured and used to advance this study towards validation of the taxonomy. After the workshop, the authors of this paper plan to incorporate all the advancements into the next stage of the research, which will feed into a subsequent paper on a taxonomy of dark patterns, addressing the identified research questions.

Agenda

13:30 Introduction (20 min)
13:50 Presentations (40 min)
14:30 Break (15 min)
14:45 Card Sorting (45 min)
15:30 Live Analysis of Card Sort (15 min)
15:45 Break (15 min)
16:00 Unifying Dark Patterns Taxonomy (30 min)
16:30 Presenting Unified Theory (20 min)
16:50 Award Ceremony (5 min)
16:55 Next Steps (5 min)

Organizers

We look forward to seeing you all on April 17, 2018, in Waterloo, ON, Canada:


Background

Dark patterns are interactive design patterns that influence technology users through deception, trickery, or hostility; they make users' lives difficult or cause negative impact through intended or unintended design practices that represent unethical applications of persuasive technology [2]. Examples of dark patterns include getting people to purchase unnecessary insurance, signing users up for products without their knowing they are on recurring billing, exposing users to content that makes them feel bad about themselves in order to influence their behavior, and environmental designs that are effectively ‘hostile’ to particular groups such as the homeless, cyclists, or pedestrians.

Industry professionals have raised public awareness of dark patterns and have taken steps to identify, collect, and describe them. Their efforts have helped draw attention to dark patterns [6]. However, there is no comprehensive practitioner taxonomy; the existing ones tend to lack a strong theoretical basis, and the taxonomies put forward draw on colloquial terms rather than frameworks typically employed by the behavioral sciences.

Importance

There is a practical need to develop a science-based taxonomy of dark patterns that makes stronger links to behavioral science principles [4]. The existing taxonomies [3] can be greatly improved through a more systematic approach that better identifies the theoretical underpinnings. We extend our definition of persuasive technology beyond the digital to illustrate a long history of darkness in persuasion; our idea is that the taxonomy should be holistic and applicable to different persuasive contexts. As the Internet of Things reaches further into our lives by digitally connecting physical objects and infrastructures, the persuasive effects of smart cities, smart transport, and smart homes are likely to be profound.

For this reason, we have been carrying out a systematic review of information about programs, apps, behavioral and environmental designs that may contain dark patterns. This study is aimed at improving our understanding of persuasive technology by better describing what dark patterns are, identifying their theoretical underpinnings, and developing precise definitions that will both describe how to identify them, and how to describe when they are being applied [5]. We are also considering the ethical status of dark patterns in persuasive technology [1].

Goals and Research Direction

This workshop will support two goals. First, it provides an opportunity for participants to share knowledge regarding dark patterns, expressed appropriately (for example, using visual depictions where necessary), and to discuss the psychological principles that explain their persuasiveness. Second, it will use participatory methods, not just to engage participants but also to advance this project's goal of developing a taxonomy of dark patterns that is more rooted in behavioral science, by making stronger links between theory and practice. Participants will also be equipped to consider whether the use of dark patterns is ever ethically acceptable.

For the workshop, we have prepared a list of research questions to be discussed and further investigated:

  • What taxonomy of dark patterns aligns with the behavioral sciences?

  • What are the psychological mechanisms driving dark patterns?

  • How do dark patterns differ from anti-patterns (backfires, misfires, outcomes of poor design)?

  • Why do people believe a design pattern constitutes a dark pattern?

  • How do we define the boundaries between persuasive, manipulative, and coercive?

  • Are there shades of darkness, from grey to black? How do we detect instances of dark patterns?

  • When does a dark pattern and a psychological backfire overlap?

  • What are the emotional impacts of dark patterns?

Examples

We have gathered real-life examples from persuasive and environmental technology that we or our informants regard as dark. Arguably, whilst users still have choices, by linking these examples to behavioral theory we intend to show that unconscious processes are triggered first and more strongly than the conscious processes required to make informed choices. The balance of knowledge is asymmetric in favor of giving the provider more power.

Structure and Outcomes

The workshop will start with an introduction followed by presentations on cognate topics, designed to provide background for the interactive exercise:

  • Ethics of persuasive technology.

  • Resistance against manipulative behavior (trust in source, oxytocin, and emotion).

  • Grey areas in our understanding of dark patterns.

  • Taxonomy of dark patterns (overview sheet and cards).

  • Interactive classification exercise (e.g., Delphi or Q-methodology).

  • Review of the data, discussion, and clean-up of the taxonomy based on group consensus.

Prior to the workshop, our team will have completed a systematic review of dark patterns, collecting examples in narrative and visual format and drawing on academic and practitioner sources. These will be aggregated into a list that includes both simple examples with clear links to persuasion theory and examples that are more complex and not as easy to identify. Using expert review and consensus building, our team will develop a first taxonomy that links dark patterns to principles routinely used in behavioral science, based on a grounded theory methodology and finalized through expert review and consensual agreement. The final output will be a list of dark patterns, linked to theory, with an example of each pattern. These will be printed onto cards, so that each table in the workshop will have a list of dark patterns.

In the workshop, we will form working teams that will review the dark pattern taxonomy, looking for alternative theoretical explanations. Each working team will participate in a group sorting exercise designed to inform the development of a theoretically framed taxonomy of dark patterns. All outputs of the workshop will be captured and used to advance this study towards validation of the taxonomy. After the workshop, the authors of this paper will incorporate all the advancements into the next stage of the research, which will feed into a subsequent paper on a taxonomy of dark patterns, addressing the identified research questions.
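To make the consensus step concrete, one common way to aggregate group card-sorting results is to compute how often each pair of cards lands in the same pile across participants, then merge cards whose co-occurrence exceeds a consensus threshold. The sketch below is illustrative only; the card names, the example sorts, and the 0.6 threshold are invented, not workshop data:

```python
from itertools import combinations

# Hypothetical card-sort data: each participant groups dark-pattern cards
# (card ids) into piles of their own choosing.
sorts = [
    [{"bait", "switch"}, {"nag", "shame"}],    # participant 1
    [{"bait", "switch", "nag"}, {"shame"}],    # participant 2
    [{"bait", "switch"}, {"nag"}, {"shame"}],  # participant 3
]

cards = sorted({c for s in sorts for pile in s for c in pile})

def cooccurrence(sorts, cards):
    """Fraction of participants who placed each pair of cards in the same pile."""
    n = len(sorts)
    co = {}
    for a, b in combinations(cards, 2):
        same = sum(any(a in pile and b in pile for pile in s) for s in sorts)
        co[(a, b)] = same / n
    return co

def consensus_groups(co, cards, threshold=0.6):
    """Merge cards whose co-occurrence meets the threshold (single linkage)."""
    groups = [{c} for c in cards]
    for (a, b), score in sorted(co.items(), key=lambda kv: -kv[1]):
        if score < threshold:
            break
        ga = next(g for g in groups if a in g)
        gb = next(g for g in groups if b in g)
        if ga is not gb:
            groups.remove(gb)
            ga |= gb
    return groups

co = cooccurrence(sorts, cards)
groups = consensus_groups(co, cards)
print(groups)
```

With these invented sorts, only "bait" and "switch" are grouped by every participant, so they merge while the other cards stay separate; raising or lowering the threshold trades stricter consensus against broader categories.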

References

  1. Berdichevsky, D., Neuenschwander, E.: Toward an Ethics of Persuasive Technology, Communications of the ACM, 42 (5), 51-58 (1999).

  2. Fogg, B.J.: Persuasive Technology: Using Computers to Change What We Think and Do. San Francisco: Morgan Kaufmann (2003).

  3. Spahn, A.: And Lead Us (not) into Persuasion…? Persuasive Technology and the Ethics of Communication. Science and Engineering Ethics, 18(4), 633-650 (2012).

  4. Stibe, A. and Cugelman, B.: Persuasive Backfiring: When Behavior Change Interventions Trigger Unintended Negative Outcomes. In International Conference on Persuasive Technology, pp. 65-77. Springer (2016).

  5. Verbeek, P.P.: Persuasive Technology and Moral Responsibility Toward an Ethical Framework for Persuasive Technologies. Persuasive, 6, 1-15 (2006).

  6. Zagal, J. P., Björk, S., Lewis, C.: Dark Patterns in the Design of Games. In Foundations of Digital Games (2013).

Persuasive Technology: Making a Difference Together
Apr 17, 9:00 AM


Merged with the Transforming Sociotech Design (TSD) Tutorial

About

This workshop will discuss the research efforts being made to change human behavior and attitude. It will engage the persuasive technology community to jointly examine where we stand and where we want to take the field. In 2018, it will be fifteen years since the seminal book on persuasive technology was published. Since then, twelve annual conferences have been held on the topic. The Persuasive Technology community has attracted many young scholars and has kept a very strong core of leading scientists in the research area. At the same time, not all expectations have been met over the last decade. Therefore, the community needs to come together and discuss ways for natural expansion and strategic growth. We need to acknowledge weaknesses in the area of behavior change interventions and seek ways to overcome them.

This workshop will help facilitate discourse around human behavior, behavior change, early interventions for behavior change, persuasive technology, persuasive systems design, design with intent, personalized persuasion, behavior change support systems, health behavior change, socially influencing systems, user experience design for behavior change, computer-supported influence, and more. The workshop will discuss open questions, promote a healthy debate amongst academics, create strategic directions, and unify everyone around what’s essential for advancing the community in a fundamentally fresh and novel way.

Register

To register your participation, please visit the Transforming Sociotech Design (TSD) tutorial page.

Organizers

We look forward to seeing you all on April 17, 2018, in Waterloo, ON, Canada:


Opening

People have unique beliefs and values that shape their personalities over time. However, not many act in accordance with their beliefs and values, and it is not surprising to find a contradiction between people’s beliefs and their actual actions. Such inconsistencies gave birth to the Theory of Cognitive Dissonance [1]. Indeed, it was this particular gap between people’s beliefs and actual actions, recognized by academics, psychologists, and researchers, that led to the manifest role of persuasive technology in shaping human behavior.

Advancing

While several scholars studied human behavior and early interventions were designed to guide users through the behavior change process [2,3], Brendryen and Kraft proposed that technology-based interventions had the potential to change people’s behaviors [4]. In 2003, Fogg introduced a new research area known as Persuasive Technology [5]. His work originates from human psychology, and hence it is essential to understand the interplay between psychology and technology when interventions are developed to shape human behavior. The research field of Persuasive Technology highlights the potential of technology as a tool for persuasion, where technology acts both as a medium and a social actor [5]. Following Fogg’s work, researchers from around the globe started developing and analyzing persuasive technologies for a wide range of areas, including but not limited to the promotion of physical activity [6], saving energy [7], living happily [8], reducing soda consumption [9], managing mental disorders [10], and persuasive cities [11].

Learning from Success!

Available research largely provides evidence of learning from success. In other words, it is relatively hard to find scientific publications in the area of Persuasive Technology that highlight failures. This compels us to ask whether we as researchers can learn from success only. Or are our research settings so flawless that our outcomes are always positive? It remains a fact that real knowledge is verified knowledge, in the sense that the knowledge base should be proven by intelligence or by (logical) evidence. Further, scholarly integrity in any research discipline demands that researchers abstain from unverified remarks [12]. In other words, we must disown biased and speculative results. We propose that the same should be practiced in the research field of Persuasive Technology. Persuasive Technology has received a great deal of attention from researchers who have developed stand-alone applications to promote desirable behaviors. However, a quick look at the previous proceedings indicates that researchers are still focused on application-driven studies with little attention to theoretical grounds. Hence, there is a lack of balance between studying technologies and the theories that support the work.

Bias?

Another area that calls for discussion is an evident lack of publications that highlight failures. This is in line with a review of empirical studies by [13], who investigated a variety of persuasive information systems and reported that the reviewed studies primarily reported fully positive or partially positive effects. We argue that a partial reason for the absence of publications that report failures is publication bias: only manuscripts with statistically significant results tend to be accepted, while most other submissions are rejected. Similar reservations have also been put forward by [14].

This, in a way, is publication suppression that prevents what could otherwise be quality papers from being accepted. For Persuasive Technology, this would result in serious inaccuracy in the available literature. There is substantial evidence for the existence of publication bias. Banks and colleagues propose that the degree of publication bias has grown to such an extent that available research results are unreliable. Further, they highlight that publication bias is one of the greatest threats to the legitimacy of meta-analytic articles, which in turn are among the most significant instruments for advancing scientific research [14]. There could be several reasons for publication bias, one being authors’ own decisions. In simple terms, authors have more control over their data. A classic example would be a situation where authors do not submit their work because of a small sample size, statistically insignificant results, or findings that contradict previous research.
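One simple way to probe a body of studies for the publication bias described above is a Begg-style rank correlation between effect sizes and their standard errors: under bias, small studies (large standard errors) tend to report inflated effects, producing a positive association. The sketch below uses invented study data purely for illustration; it is not drawn from any Persuasive Technology meta-analysis:

```python
from itertools import combinations

# Hypothetical meta-analytic data: (effect size, standard error) per study.
# Under publication bias, small studies (large SE) tend to show inflated
# effects, yielding a positive effect-size/SE rank correlation.
studies = [
    (0.10, 0.05), (0.15, 0.08), (0.30, 0.15),
    (0.45, 0.20), (0.60, 0.30), (0.80, 0.40),
]

def kendall_tau(xs, ys):
    """Rank correlation: (concordant - discordant) / total pairs."""
    pairs = list(combinations(range(len(xs)), 2))
    concordant = sum((xs[i] - xs[j]) * (ys[i] - ys[j]) > 0 for i, j in pairs)
    discordant = sum((xs[i] - xs[j]) * (ys[i] - ys[j]) < 0 for i, j in pairs)
    return (concordant - discordant) / len(pairs)

effects = [e for e, _ in studies]
errors = [se for _, se in studies]
tau = kendall_tau(effects, errors)
print(f"tau = {tau:.2f}")  # values near +1 hint at small-study effects
```

A near-zero tau would be consistent with an unbiased literature; this invented data is constructed to show the biased case, where effects grow monotonically with standard error.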

Heading Where?

The issue of publication bias applies to almost all research disciplines, and the research area of Persuasive Technology is no exception. Here, we would highlight another issue that is similar to publication bias: what we generally call conflict of interest. If we go through the proceedings of all the conferences on Persuasive Technology, it becomes crystal clear that prominent names appear both in the scientific committees and in the lists of authors of accepted publications. This is a clear case of conflict of interest, one would argue. While there is no substitute for experience, and we can never underestimate the contribution of senior researchers, it seems relatively clear that the research area of Persuasive Technology is in what one might call the “rigid control” of a few. As an example, consider the International Conference on Persuasive Technology. One would notice that a high majority of the Steering Committee remains the same, and that most of its members have at least one paper published in the proceedings.

Outcomes

The proposed workshop aims to bring researchers together to a forum that facilitates constructive discussion and debate. The research area of Persuasive Technology is receiving increasing interest from across the globe, and deservedly so. Yet it is observed that the audience at Persuasive Technology conferences revolves mainly around the same crowd, with a few exceptions. It is anticipated that the workshop will provide an opportunity for researchers from different disciplines to address the issue and come up with constructive recommendations, leading to change for the advancement of the Persuasive Technology community.

We welcome topics including but not limited to:

  • Theory-driven persuasive design

  • Publishing failures

  • Multidisciplinary contributions

  • Publication bias

  • Attracting a larger audience

  • Strategic steering

References

  1. Festinger, L. (1957). A Theory of Cognitive Dissonance. Stanford, CA: Stanford University Press.

  2. Revere, D., & Dunbar, P. J. (2001). Review of Computer-generated Outpatient Health Behavior Interventions Clinical Encounters “in Absentia”. Journal of the American Medical Informatics Association, 8(1), 62-79.

  3. Reiter, E., Robertson, R., & Osman, L. M. (2003). Lessons from a failure: Generating tailored smoking cessation letters. Artificial Intelligence, 144(1), 41-58.

  4. Brendryen, H., & Kraft, P. (2008). Happy Ending: A randomized controlled trial of a digital multi‐media smoking cessation intervention. Addiction, 103(3), 478-484.

  5. Fogg, B. J. (2003). Computers as persuasive social actors.

  6. Toscos, T., Faber, A., An, S., & Gandhi, M. P. (2006, April). Chick clique: persuasive technology to motivate teenage girls to exercise. In CHI'06 extended abstracts on Human factors in computing systems (pp. 1873-1878). ACM.

  7. Midden, C., & Ham, J. (2009, April). Using negative and positive social feedback from a robotic agent to save energy. In Proceedings of the 4th international conference on persuasive technology (p. 12). ACM.

  8. Chatterjee, S., & Price, A. (2009). Healthy living with persuasive technologies: framework, issues, and challenges. Journal of the American Medical Informatics Association, 16(2), 171-178.

  9. Langrial, S., & Oinas-Kukkonen, H. (2012). Less fizzy drinks: a multi-method study of persuasive reminders. In Persuasive Technology. Design for Health and Safety (pp. 256-261). Springer Berlin Heidelberg.

  10. Langrial, S., Oinas-Kukkonen, H., Lappalainen, P., & Lappalainen, R. (2014, May). Managing depression through a behavior change support system without face-to-face therapy. In International Conference on Persuasive Technology (pp. 155-166). Springer, Cham.

  11. Stibe, A., & Larson, K. (2016). Persuasive cities for sustainable wellbeing: quantified communities. In International Conference on Mobile Web and Information Systems (pp. 271-282). Springer International Publishing.

  12. Lakatos, I. (1976). Falsification and the methodology of scientific research programmes (pp. 205-259). Springer Netherlands.

  13. Hamari, J., Koivisto, J., & Pakkanen, T. (2014). Do Persuasive Technologies Persuade?-A Review of Empirical Studies. In Persuasive Technology (pp. 118-136).

  14. Banks, G. C., Kepes, S., & McDaniel, M. A. (2012). Publication Bias: A call for improved meta‐analytic practice in the organizational sciences. International Journal of Selection and Assessment, 20(2), 182-196.

  15. Available at: http://scitechconnect.elsevier.com/why-science-needs-to-publish-negative-results/?utm_source=socialmedia&utm_medium=All&utm_campaign=Why%20Science%20Needs%20to%20Publish%20Negative%20Results&sf8382783=1. Accessed on April 15, 2015.
