ODI Fridays: Digital mental health: finding signals, respecting noise and dealing with uncertainty



Welcome to ODI Fridays. We do this every Friday; you're welcome to join here in our offices or online, and you can always follow the hashtag #ODIFridays for upcoming events and to send in your questions. Today's talk is presented by Becky Inkster, who will talk to you about digital mental health and more. If you have questions, please hold them until the end and we'll hand around the mic.

Thank you for your time; it's a great pleasure to be here. Today I'm going to be talking about topics that are very sensitive and very controversial, so I appreciate your patience as we go through these discussions. I've spent about 20 years in the field of mental health, looking at it from different lenses. I'm not an expert in any one particular field or area, but I'm extremely curious across different sectors and different ways of looking at mental health. Those are my disclaimers, and I have no conflicts of interest. Three words that mean a lot to me are integrity, imagination and inclusivity. These are my current formal affiliations, which you can see listed here, but I like to see myself more through visual means: enjoying the arts, 3D printing of jewellery, and other ways of expressing myself and my interest in mental health. I started in genomics, neuroimaging and big data, and realised that I wanted to make more impact directly with people. I have experience running randomised controlled trials with the drug erythropoietin, but again I wanted to make more impact through arts, music, culture, policy and so on, and I've been trying to do that over the course of twenty years. This is my boss, so I'm usually at home in the sandpit; my partner's here today looking after her. She's nothing like AI: she does not follow commands as well as bots do.
I'm going to talk a little bit about my own experiences, because I don't always do that, but I feel the Open Data Institute is a good space for me to open up and be a little bit vulnerable about some of the experiences I've had. I'm also going to try to highlight the whole field, which is ridiculous, in about five minutes, so be patient and we can continue our discussions afterwards. I'm very interested in creative behaviours, for example, and I will be calling for your support on different projects: I've got a lot of ideas that I can't tackle on my own, I work with wonderful people, and I'm always looking for more people to work with. The things in the blue box matter a lot to me.

I am a data scientist, among many other things, and I grapple with a set of trade-offs and how they can sit together and contrast with each other. In the world of mental health our effect sizes and our signals are very, very small. They're very personal; they differ within ourselves across time and across people, and we have a lot of noise, which sometimes I don't describe as noise. So we have to think about the individual and personalised, precision health, but we also have to think about aggregated information and the population, and that's really tricky methodologically: you can lose a signal when you combine too many voices, or you can gain strength in the signal. I think about active versus passive, both in terms of monitoring or surveillance and in terms of intervention. We tend to think in terms of passive monitoring and active interventions, but I've been playing in a space of active monitoring and passive intervention, and I can give you a few examples later. Then there's classification versus prediction, which I won't go into too much because I'm sure a lot of you are familiar with it, but we really struggle with this in mental health: accuracy can get very high when you're trying to separate groups or subgroups, but prediction is very, very low, as I'll show through one example. Linkage versus privacy, and privacy-preserving technologies, are obviously a huge issue: a lot of people are moving through different platforms and tools, the data is moving, and learning about that longitudinal signal, what we're missing, and how we're able to capture it appropriately, with consent, I think is really important. Synthetic data versus real data is going to be very interesting in mental health, controversial as well, but it will give us an opportunity to at least try to explore with more numbers, because we are very challenged when we work with small datasets or with very rare events; we can start to think bigger than the planet, so to speak. And then in the clinic versus in the wild: what are the trade-offs here? When you work with an anonymous bot you don't have access to certain things; you protect privacy, which might boost the signal, but you can't regress out factors like gender. So I'm looking at these trade-offs and at when we should use which environment.

On my own experiences, just to highlight a few quickly, though I'm happy to discuss at any point. At one point in my career I had an issue with data quality, and I didn't really know what to do, but it turned out that opening or closing a window, or clicking the same thing all the time, led to massive fluctuations, and that jeopardised not only my findings but the entire group's findings, and it led to buying equipment that cost a lot of money. The way I approached my group once I realised, to say "this is some bad news", and used numbers to actually show it, really shaped how I feel I can share information with people. Data quality has to be very high for me.

I'm a co-founder of HIP HOP PSYCH with Dr Akeem Sule, a consultant psychiatrist, and we had a viral article: 1.3 million hits in 72 hours. That led to a lot of attention, and it also led me to think about impact rather than impact factor. Yes, it was The Lancet Psychiatry, a great editor and a great journal, but it was about impact. I received a letter from a prisoner who was incarcerated for murder, and going through that letter was a very interesting experience: how he's turned his life around within prison, training to become a psychotherapist, and how he appreciated the work we were doing. Very complicated things. I'll skip forward quite quickly, but I got very close to the Cambridge Analytica issue in Cambridge, where I'm a fellow in the Department of Psychiatry, and made some really good choices. I had a really interesting conversation with the Wellcome Trust, where I argued with them in a constructive way to ask for less money, which is interesting, and to repurpose the money for a cause that I believed in; initially I said I'd take nothing, and we got there eventually. Ethical entrepreneurship: getting a little taste for how to do things differently from standard academia. I did a poll at the American Psychiatric Association and found some surprising, striking findings about data privacy and sharing, so that was quite an interesting experience to handle. On a more positive note, I was involved with Instagram's removal of graphic self-harm images, which you might have seen or heard about. Again I find this really interesting, because the experts in the world of suicide prevention were divided to some extent; it was very interesting to see that even in that extreme scenario there's always a grey area. When the press release was announced I asked for a specific phrase to be edited, which Facebook didn't really want to do, so I had my name removed from the statement itself. We have to be careful about what our name goes on: if it doesn't follow your beliefs, you should stand behind that and be very ethical.

Today, hopefully without overrunning too much, there are three areas where I would love support and guidance, because they're controversial and I want to do good. I care about mental health, I care about people, and these are intense issues. I find that I have to take more risk and go into riskier areas in order to bring the mental health conversation into those spaces, so I'd love your input. I'll skip this, but just to say briefly: I've approached 40 editors, five of whom are willing to change their abstract outline to include something called "ethical discoveries". We haven't yet fully gone through that, so if you're interested in hearing more and getting involved in pushing this to become more real, that would be great. Basically, it's not just about the IRB or the REC: things happen after you get your ethical approval, and these are the gold nuggets, the very interesting bits of information, that we should be putting into our papers. We should be telling people that we have an algorithm that scrubs faces if you're just doing head counts, which reduces your file size and saves money; or, if you are working with Twitter data on a gang member who is deceased, what is the issue about consent? One solution could be community consent, and how did you go about doing that? In one sentence: I think we can inspire academics to learn from experiences that we face but that just don't really get talked about. Again, I'd love your involvement there.

This is a whole bunch of tiny pictures of technology in digital mental health. In the interest of time I won't go into too much detail, but you can see anything from digital social prescribing, online prescriptions and digital therapeutics, to treatment online, chatbots, virtual reality, even drug discovery using VR; sensors everywhere you could imagine, something that could hug you if you fall, tangible interfaces, and so on. We're not short on solutions, is what I'm trying to say. In 2013, if I had said to Siri that I was considering ending my life by jumping off a bridge, this would have been the response. From a mental health perspective this is deeply upsetting, and have we come further from that point? I'm not so sure. I want to think we have come a far distance, but there's so much more we need to do. Just by speaking with colleagues and people that I really respect, sometimes phrases come out that really surprise me, like "the model was better than nothing". In mental health that's not necessarily true: sometimes doing nothing is better than risking harm. That kind of made me nervous. Someone asking "who prefers life to death, or death to life?" and asking people to raise their hands: again, this is so sensitive that it just did not sit right with me. Even within digital mental health there are situations where a company said something like "we have known privately for some time", so they're just holding back; it just sounds a bit creepy.

I'm guessing a lot of you have seen this, but just to illustrate the point that mental health can be invisible: when you add the smallest fraction of noise to mental health data, it's not easy to fix the error. A classifier can be extremely confident, 99.3 per cent, and still be wrong, and the symptoms, the feelings, the narratives can be heavily changed or manipulated. You might have seen this as well: sensors. This is a very clear example of getting it really, really wrong through a lack of diversity in tech: a person of colour puts their hand under the soap dispenser and the sensor doesn't recognise it, whereas a white hand, or a white paper towel, it does. Now imagine that discrimination and bias in mental health, where our signals are, I would say, invisible, very difficult to find, and where I think we really need to step up our methodologies. To give one example very briefly: there were some suicide prediction models asking the question "are we ready to do this?", and the accuracy of predicting a future event was near zero. I have to skip this, but I'm very happy to discuss it in great detail. You may know the dead salmon example in big data and neuroimaging: a dead salmon was "showing activation" in response to happy or emotional faces. I know it's a crude example, but it shows that with thresholding, and all the things we need to do with the data, it's very easy to get it wrong. That was a long time ago, and it hadn't corrected for multiple comparisons, but we can see modern problems where we could get it wrong too. I won't go into all the details, but there are a lot of issues we face with longitudinal data and group data, especially in mental health. You want to rule out a physical condition, a thyroid condition or something else that may be masquerading as a mental health problem but is actually a physical problem; so where is the physical examination when we go digital for mental health? There are other issues too, for example diagnostic manuals: different continents don't agree all the time, so when you're trying to scale your digital mental health solution you can run into trouble. You can see here the customer saying, in effect, "love the app, it's a great app, but you don't recognise that I overeat and I oversleep, and I don't think that's healthy"; they were using a particular diagnostic manual that didn't recognise that. So we have to be so careful about scale, and about going international, to countries where suicide is a crime and you could go to prison.
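The dead salmon example above is, at heart, a multiple-comparisons problem: test enough pure-noise voxels at p < 0.05 and some will look "active". A minimal sketch, using only the fact that null p-values are uniform on [0, 1] (simulated numbers, nothing to do with the actual study):

```python
# Under the null hypothesis, p-values are uniform on [0, 1], so testing
# many independent pure-noise "voxels" at p < 0.05 flags roughly 5% of
# them as false positives -- hundreds in a whole-brain scan. A Bonferroni
# threshold (0.05 / number of tests) removes essentially all of them.
import random

random.seed(42)
n_voxels = 10_000
p_values = [random.random() for _ in range(n_voxels)]  # pure noise

uncorrected = sum(p < 0.05 for p in p_values)
corrected = sum(p < 0.05 / n_voxels for p in p_values)

print(f"'Active' noise voxels, uncorrected: {uncorrected}")
print(f"'Active' noise voxels, Bonferroni-corrected: {corrected}")
```

Expect roughly 500 uncorrected "hits" and essentially none after correction; the salmon's "activation" was exactly this kind of uncorrected hit.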
When we scale, things become a lot messier, and everyone is so quick to want to scale. And of course you get phrases like "you could have a diagnosis within 14 days", which are just very damaging; I think there should be legal penalties for that. But trust does seem to be there in digital mental health, which is encouraging. There are lots of examples, but I chose this one: veterans were three times more likely to open up to a virtual assistant compared to the gold standard of an anonymous questionnaire, which I think is interesting. We've got task bots and social bots, and we're starting to see the future of mental health bots emerge: there are some playful ones, some using generative models, some using safer, scripted models. So this will be an issue in how we actually use AI: it's not just about the Turing test, it's also about the safety test. What do we want the bots saying, and who has to okay this in advance? It's interesting that 63 per cent, of I think fewer than 5,000 young people, said they would be comfortable with a bot giving them a diagnosis. I think we need a lot more research in this space, and I'll leave it at that, but I can see where young people are coming from in this context. I'll skip this, but just to highlight: with retrieval-based models versus generative models we have to be a little bit careful about complex and unseen queries, and about how we balance that fine line between pushing our algorithms to behave and respecting that we need human control of the model as well. In mental health you don't want the bot to forget something someone said five steps ago if it's going to trigger their trauma; you don't want to ask them something again, or stray into a no-go topic. At the same time we must be so careful, because it's easy for bots to get it wrong and say nasty things. We also need to modernise the tools that we're putting into the technology: it's not just that tech is moving fast; we have to move the clinical tools, like the cognitive behavioural therapies, along as well, and find a pace that works for both the clinical end and the technology. There are many things we'll need to think about with bots, especially if they're going to be a big part of the future: affective decision libraries, and the idea of artificial personalities playing a role and how that gets embedded in a mental health context, matter a great deal.

Okay, so I'm going to jump into three examples, and since I'm short on time I might pick up the pace here. I love this quote, not because I said it, but because it's so true: one person's noise is another person's signal. There's so much stigma around hip-hop and mental health, and with hip-hop this is so, so true. I wanted to show you a couple of things we've been up to, spanning a couple of decades. If you search hip-hop lyrics for the words that doctors would use, you basically see a flatline, because no one's rapping about "promethazine" and terms like that, yet. That's rap word frequency across time in hip-hop lyrics. But ask what young people are actually saying: note that "ecstasy", the yellow line, is flat, while "Molly", which is the term young people would use, spikes before 2010. There are a lot of socio-political trends happening here; I can tell you a lot of geographical and very interesting things from this data that I couldn't from the clinical terms. Then there's object recognition: this is codeine and promethazine; this is sizzurp. You have to understand the context, to know what you're looking for, to get the signal. Xanax, as I'm sure you've heard in news stories, especially in England lately, has been on the rise; obviously there's a frequency thing going on there, plus "Xan" and other abbreviations of it. Now amplify that with social media: when you get 360 million views, these words are being amplified out to young people. So this is a tricky thing: advertising and promoting these negative messages is extremely amplified on social media, and we've got to explore it in more detail. On the dark web, I'm very interested in exploring why young people are going there to self-medicate and find solutions, as opposed to going through other available channels.

I mentioned active monitoring. I'm working with some excellent colleagues, some really great people, including Key Changes, and what we're doing is building trust before we put sensors anywhere near anyone. You can see here heart rate across time as they're writing their songs. We just provide an instrumental beat; they can write whatever they want, edit their track, get prepared, maybe do some other stuff, go into the booth, the recording studio, and then do a post-take. There's a very interesting pattern in their heart rate. Focusing just on the initial writing stage, we can see resting, writing begins, pauses, thinking, and so on; a sudden stop where something happens to the heart rate; writing continues; heart rate drops. Everyone's different. There are a lot of patterns, but each person has their own way, and we can also look at what they're writing. It's very interesting: this person is talking about the mania they're experiencing and not being able to sleep, so mental health is very much a part of hip-hop culture. We know freestyling also affects the brain, and I'm working with some excellent people on lyrical combat. What we're trying to do here is let people freestyle with key words that they can work into their responses, and the algorithm scores this. It's trying to build verbal dexterity, verbal skills, in young people, and this is a big thing, which I won't go into, related to mental health: psychosis, schizophrenia, negative symptoms. So what we're really trying to do is help mental health while making it really fun and engaging. We can cognitively load it with the words, or we can emotionally load it; we can help them by making it rhyme, and things like that. And it's social: you can share it, or you can keep it private. We know that vocal psychopathology is an area we have to explore in mental health, and it's showing some interesting indicators in other diseases too; when you emotionally or cognitively engage the individual, the signal gets more intense. There are many ethical things we'll be considering, which I'll skip, but we can discuss.

The second topic I wanted to cover briefly, again asking for your support, is acoustic passive interventions, which are not happening now to my knowledge, but which need to be tested, whether for use in youth mental health or not. I had a good conversation earlier about the roughly 15-kilohertz sound used to disperse loitering teenagers; but we're working in the ultra-audible range, the pink part of the signal there. When you start playing with sound, really creepy things happen. This is a paper from 1947, I think published in '48, and I won't discuss it in too much detail, but it's very disturbing what sound can actually do to animals and to people, including heating, and it's incredibly dangerous in the wrong context. So we have to be extremely ethical. What we're trying to do is almost borrow the metaphor of the visual QR code, where you can embed information, but as a sonic, audio QR code: you can embed data into the sound. Again this comes with great risk, because it's ultra-audible, you can't hear it, and information is being transferred in that space. That's just one example. This is a very busy slide, so ignore it, but just to show you that we're doing some preliminary work with EEG; it's not the best tool, but it's a tool, and we'll go through many more types. These ultra-audible signals are going into the brain, and about 20 per cent of people were able to actually perceive the ultra-audible in some way. We are very keen to understand the ethics before anything else is even done in this field, so I welcome any thoughts and suggestions on how we should move forward very carefully. I think it's a really good opportunity for anticipatory policies and regulations: let's get prepared for the possibility, or even just the possible future, of these acoustic interventions. Individual differences really matter here, from a genomic perspective and a psychological perspective, so we need to make sure we understand everyone's experience, and keep monitoring. And to protect against anyone ever wanting to use this for bad, how can we create a system that can be aware of that as well? As I said, it's not worth pursuing unless it benefits mental health, and if it harms, then we still have a purpose and a duty to regulate, to share, and to figure out what to do with it.

The third controversial area I'm exploring is mental health and financial data. We know there's a relationship here, and I put out a call to action in The Lancet Digital Health on Monday. There are many reasons, a few listed here and all listed in the paper, but there is a relationship between financial difficulties and mental health, as I'm sure you can imagine. We're seeing the emergence of these fintech challenger banks, and they're collecting a lot of financial phenotyping information. What are they doing, and what is it used for? There's a lot of data that they're building, and rather than just being worried about it or ignoring it, I want to try to bring them into a conversation with mental health and see if we can do good with this information somehow. We see initial signs from the fintech world that they're trying to help customers, for example by enabling them to block their own transactions: for a 48-hour period you can block gambling, and things like that. It's a good gesture, but I think they need to extend their duty of care, and we need linkage across mental health and care. There are some great examples out there of ways you can leverage this, but I think we need a really good conversation, so I'm trying to create a consortium, bringing people together, on how we can put this information together for good. I don't mean just combining bank details and intimate electronic health records; I mean we can also create bots and embed financial support into them, and really leverage the mental health field and bring finance into that space a little bit more, always being sure about consent and that the users are in control.

Just to end: I would love your help. There are a lot of different topics, and I'm certainly not doing this all by myself, so we're always looking for people interested in developing any of this work further. If you're interested in digital mental health, I run a conference every summer; there are a few spots left for this year's, on the 13th and 14th of August in London, so please get in touch. And thank you for your time.

Thank you, that was great, lots of information; I'm sure there are lots of questions. Before we start, keep in mind that the mic does not amplify your voice, it's just for recording, so please hold it close to your mouth when you speak. Would anybody like to start with a question or two? If not, I'm happy to start something.

So in my area, one of the biggest problems, sorry, a big problem, is wellness versus the more clinical end of things. The wellness industry, I think the global market is 4.2 trillion, and I worry that some people might develop tools that nudge people who are more or less okay, what we call the "worried well", into worrying about their health, and manipulate people into buying products or influence their decisions. So I'm very keen to make sure there's transparency about who's doing what, and what their business models are.

That's a really interesting point, because I remember reading a while ago that apps like Headspace and other mindfulness apps were pushing people into more anxious states rather than taking them out of them. Does that reflect what you've just said?

It always comes with the disclaimer that everyone is different, so what's right for one person would be wrong for another, but I do agree. I think one way of adding transparency, which we're not yet really discussing, is: what is the business model? If you're going to have a business, what are your incentives, and how do you balance that with trying to actually help people? In the next year I'd love to really start going around and asking the companies; I don't have any conflicts of interest, and I just want to learn more about how they can stay sustainable and stay afloat while making sure every penny goes toward something positive.

Is there any data held by the private sector, or a particularly tricky sector for getting hold of information, that you'd like to be able to access, either to support your research or to help others develop tools?

I think the biggest one would be fintech, the financial sector, and how to do that in the most sensitive way. I alone won't have the answers, but I'm hoping people will be able to brainstorm ways, whether it be data enclaves or ways of linking data without it truly touching, or only briefly in time; I know some data experts are really keen on that. But again, this is sensitive: it's mental health and finance data, and it's very easy to be scared about bringing those two things together. I would love to open that up, because I do think in that financial data there could be some really interesting social determinants, upstream predictors, really powerful ways of saying "this person might want help", provided they are in control of all the decision-making and have chosen in advance to engage with it. It comes with great risk, but if we could find a way to do it, it would be really good for mental health.

We do a lot of work around data ethics at the ODI. Would you say that the work you do, and the impact you're trying to have, should fall within ethical considerations, considering the unknown unknowns? How do we build that in? We've obviously built a Data Ethics Canvas; have a look and see whether mental health, or considerations around mental health, would fit within that.

I think one strategy is to partner with people like the ODI and hold hands very tightly, so that we don't go astray and we're always making sure that ethics is side by side, or, in the case of the ultra-audible work, which hasn't even really started, out in front: where do we set limits and boundaries before we even push forward? I think it's extremely important to be ethical, so working with partners who share those goals is really important. When lines get crossed, and when banks or other sectors want to go a different way, then I think it has to be very transparent that the incentives have split and the outcome measures are not agreed.
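The "linking data without it truly touching" idea from the fintech answer above can be sketched as pseudonymous linkage: each party replaces a shared identifier with a keyed hash before anything leaves their systems, so records can be joined without raw identifiers ever co-locating. This is a toy illustration with hypothetical field names, not the speaker's method; real-world linkage (trusted third parties, Bloom-filter encodings, data enclaves) is considerably more involved:

```python
# Privacy-preserving linkage sketch: two parties join records on a keyed
# hash of a shared identifier, so raw identifiers are never exchanged.
# All names and fields are hypothetical, for illustration only.
import hashlib
import hmac

SECRET_SALT = b"agreed-by-both-parties-out-of-band"  # illustrative only

def pseudonym(identifier: str) -> str:
    """Keyed hash: can't be reversed or re-derived without the salt."""
    return hmac.new(SECRET_SALT, identifier.encode(), hashlib.sha256).hexdigest()

# Each party pseudonymises locally before sharing anything:
bank_records = {pseudonym("person-001"): {"gambling_block": True}}
health_records = {pseudonym("person-001"): {"service": "talking-therapy referral"}}

# Linkage happens on pseudonyms only:
linked = {
    pid: {**bank_records[pid], **health_records[pid]}
    for pid in bank_records.keys() & health_records.keys()
}
print(len(linked), "linked record(s)")
```

The keyed hash matters: a plain unsalted hash of something guessable (a name, an account number) can be reversed by brute force, which is one reason real deployments add trusted intermediaries or enclave environments on top.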
it's not a perfect world but I want to just try and have a little slice or angle things change the dial a little bit and that I'd be happy with with that as well the digital culture is also something that is plays big part in digital health in a way right because we're all online we're all participate we all share or not and there's definitely a lot of people who are the influencers will create quanta like this huge waves of follow-up and potentially harm or benefit notes like how could they play a role in you know yeah so there's yeah micro influencers and emotional contagion and this gray area of regulation in the social kind of world and I think it's really interesting because a lot of it slips through the cracks so there it really comes if you had my heart rate monitor you're here don't worry it's Clifton Clifton shoot you're here but yeah no I think it's unregulated and so the responsibility lies on the individual to really you know take responsibility of their message but sometimes when people are bonding to you as a social-media leader you don't know what to say you don't know what to do and that can be very scary and so how your message comes out but there's an opportunity I think to work with influencers and and to raise awareness and that's happening with already with lots of people so it's a good in the bad again just trying to always do good yeah I've just got a very broad question because it's quite a new area digital innovation for mental health who is the community who tends to be interested in this kind of question psychiatrists and healthcare professionals I'm trying to okay I'm gonna say I'm gonna say to mix it is a mix of course and that's what I love about the community so I'm building community here and there are other pockets of communities and we just keep trying to build and but it's a mix so art this might be a mental health experiment I'm not quite sure I'm not involved with it so don't cite me maybe testing our patience I don't know but 
yeah, it is a mix, and I think that's the beauty of it. Sometimes we get art therapists who design digital app tools while they themselves have chronic conditions. For example, one person here has a chronic illness and experienced a lot of pain, and so she works with digital art, because at one point she could only really use her hand at a high level of comfort. So yeah, I would say it brings together everyone, because mental health is everywhere, or I see it as everywhere and in every topic, so it touches all of our lives. But I would say healthcare has the obvious interest, and tech as well, though I think tech sometimes just kind of goes and tries. So a mix, I would say.

Okay, so the health and mental health side is very clear?

Well, even within mental health, mental health nurses, psychiatrists and that sort of thing, I think they're really keen to find ways to increase access for people, and in their day job they don't really have a lot of time to learn the tech skills, you know, to build NLP models and computational linguistics and stuff like that. But they're so keen to help people that they want to learn. More questions?

The ultra-audible thing is kind of quite exciting and quite scary. You said scary twice. Could you give an example of ways that it could be used for good, and of what you would fear it might be used for that's bad?

Okay, good. I don't even want to give a specific example, because I don't allow myself to think that far on this particular topic. The first step really is to understand what it's doing inside our brains, and also, on the outside, our minds: how we even think about the topic, how we perceive the study of this area. So yeah, I think it's too tempting to want to say that it can be used as an intervention, etc. We have put in a grant to NASA to work with astronauts in certain contexts, but again, when it comes to mental health, it's such a, I don't know, protected space. I just wouldn't want to get
that far ahead, because it could very well be a terrible thing. An example of terrible: well, you don't even have to have evidence that it's terrible; the mere perception of it being terrible, of voices being embedded into sound you can't hear, is terrifying to some people who have psychosis or some experience with delusions or auditory hallucinations. It could make people really paranoid about being in certain locations: brainwashing, watermarking, all of these types of things. So I think that's what we're focusing on now, really understanding those potential risks and how people perceive them, and then getting some measures inside the brain to see what it is doing and who it is impacting. But there is also a lot of research out there: mice have ultrasonic vocalizations when they're in distress; babies and infants can hear a little bit past our 20 kilohertz threshold; so there are ways to explore, and there's some work in autism and things like that. But me personally, I think we just take it step by step and put ethics and perceptions of the psychological experience first.

Hello, I wanted to ask: with increasing amounts of data being gathered on people, and companies developing phenotypes for their users and trying to understand them, a lot of it of course is relevant to people's mental health. But as you give people access to this data and empower them to understand it and use it, how can you avoid it becoming a normative force, with people self-monitoring and self-regulating in line with normative values which maybe haven't come from the most diverse backgrounds or representations? How can you keep that diversity?

A little bit of the quantified self, or just being overly measured. So in mental health this is very dangerous, and you could look at eating disorders, or stepping on the scales and not liking the number back in the day, and now it's just an endless space
where you can get numbers that could either reaffirm you or crush you, and I think that's extremely dangerous. So my take is: obviously working with people and giving them options of what they could see, but at the same time not necessarily showing everything about the analytics, and maybe redesigning things so that it's not based on popularity or the number of views or things like that. I think just kind of cleaning some of that data away and making it more about the connections, the relationships, the people. I would just say be careful with those numbers in mental health, because they really do make a difference. On the flip side, though, sometimes you can be using an app that will look at the side effects of your medication, and that could be extremely useful, or it could maybe stop you from doing something because you know the number. Outside of mental health, in diabetes, just knowing that, okay, maybe I shouldn't have this: there are patches that you can check with your phone to see what your levels are, and knowing that might make you not take a decision and do something else instead. So I think we'll see. I'm not sure, I don't have an answer, but I think it has to be a careful consideration, very specific to the context of the tool being used. It's a great point.

This is just a question that came up whilst watching your presentation, so forgive me if it's not fully thought out, but I just wondered where you thought the balance was between the individual versus the population. I mean, it came up when you talked about the fintech stuff: looking at that data might be a bit of an invasion of privacy for one individual person, but it might then have a greater, positive impact if you're taking it as part of a larger data set, if you see what I mean. So where's the balance?

Like creating filters, so rather than drilling right down to what sport or
a particular interest someone has, just active or not active, or just having broader labels, and then working across the aggregated information, I think, is what you're trying to say. And yes, I absolutely think it could be really good. I worry a little bit about clinical heterogeneity, which just means that the more data you add, the more, "noise" is not the right word, but different scenarios, different things, so you might lose your signal a little bit. An example of that: a neuroimaging data set was sent around to, I think, 13 different UK sites, and they effectively controlled for clinical heterogeneity by having the same data set; all the different methods still produced different results. Now layer on top of that having different people with different life experiences, cultures, views, perceptions, and so on, and you get more. It's really tricky to go from one extreme to the other, but I think it's really important to try to work at the population level with information, and just to make sure that you're not trying to say to an individual that, because of this, you fit into that. So: transparency about causality and association, and the individual not always having to be a part of that association. Trying to educate and to tease these apart is really important, but causality is really tricky with mental health.

Do we have more questions? Or silence, just to process everything?

No, I mean, it's really just that, with this information, I think not everybody is really aware of the complexity of the whole ecosystem of what's going on, because we have digital government, we have sound, we have different types of groups, and then it all kind of continues, you know, there's no pause to it, and then new things keep coming in. So it's interesting to just take a minute and think about it.
