{"id":223,"date":"2024-03-27T09:57:50","date_gmt":"2024-03-27T09:57:50","guid":{"rendered":"https:\/\/wp.lancs.ac.uk\/caiss\/?p=223"},"modified":"2024-03-27T09:57:50","modified_gmt":"2024-03-27T09:57:50","slug":"why-algorithms-pick-up-on-our-biases","status":"publish","type":"post","link":"https:\/\/wp.lancs.ac.uk\/caiss\/2024\/03\/27\/why-algorithms-pick-up-on-our-biases\/","title":{"rendered":"Why Algorithms pick up on our biases"},"content":{"rendered":"\n<p>Why do algorithms pick up on our biases? It could be argued that this is due to a 95-year-old economic model that assumes people\u2019s preferences can be revealed by observing their behaviour. However, the choices we make are not always what is best for us. We might have a carefully curated wish list on our Netflix account that reflects our true interests, yet still watch the \u201ctrashy\u201d shows Netflix recommends because they are easier to click on. These algorithms are built on what users actually do, so their predictions rest on revealed preferences, which can be incomplete and even misleading. Should algorithms move away from revealed preferences and incorporate more behavioural science? Would that improve our welfare? Or do we just need to watch something \u201ctrashy\u201d to de-stress at the end of the day?<\/p>\n<p>SOURCE: <a href=\"https:\/\/www.nature.com\/articles\/s41562-023-01724-4#:~:text=Some%20of%20these%20biases%20are,and%20may%20be%20unaware%20of.\">Nature Human Behaviour<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Why do algorithms pick up on our biases? 
It could be argued that this is due to a 95-year-old economic model that assumes people\u2019s preferences can be revealed&hellip; <a href=\"https:\/\/wp.lancs.ac.uk\/caiss\/2024\/03\/27\/why-algorithms-pick-up-on-our-biases\/\" class=\"more-link\">Continue reading <span class=\"screen-reader-text\">Why Algorithms pick up on our biases<\/span><\/a><\/p>\n","protected":false},"author":1669,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_jetpack_memberships_contains_paid_content":false,"footnotes":""},"categories":[5],"tags":[],"class_list":["post-223","post","type-post","status-publish","format-standard","hentry","category-byte","without-featured-image"],"jetpack_featured_media_url":"","jetpack_sharing_enabled":true,"_links":{"self":[{"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/posts\/223","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/users\/1669"}],"replies":[{"embeddable":true,"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/comments?post=223"}],"version-history":[{"count":1,"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/posts\/223\/revisions"}],"predecessor-version":[{"id":224,"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/posts\/223\/revisions\/224"}],"wp:attachment":[{"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/media?parent=223"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/categories?post=223"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/wp.lancs.ac.uk\/caiss\/wp-json\/wp\/v2\/tags?post=223"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}