Since I started at the RCA, I've been working to understand the underlying principles that govern digital product development. My interest is less in how these products are built, in terms of architecture or innovation in code capabilities, and more in why each system is built the way it is, how that design drives specific experiences and interactions, and how those interactions directly affect individual users.
The problem is that by calling them users, designers tend to forget that they're people.
My research began by examining how social media affects the mental health of users, but it evolved into looking at how designers perpetuate undesirable habits through the digital products they create. As I worked, and continue to work, to understand how these systems are built and used, my scope necessarily expanded to include "big data" and privacy. The two are inextricably linked: data is the key driver of every major interaction that takes place on any digital platform.
The more I come to understand these systems myself, the more I can see both the potential for harm and the potential for immense good. But without people, none of it matters. We've failed to educate our "users" about why this pursuit is important and what kind of progress it can bring. Simply watching the news, we can see in real time what happens when those same "users" lose all trust in a system that governs their lives.
Creating transparency in these systems is the key to collective understanding, and it is the only way to push towards a healthier data future: not only for individuals or companies, but for humanity as a whole. Read more at https://cpugsley.com/