I included this report in another post and the more I thought about it, the more I felt it deserved its own post, as it’s very important and relevant. You hear me talk about the terms I coined, “The Attack of the Killer Algorithms” and “Algo Duping,” all the time; well, the “scoring” is exactly that, algorithms and math formulas that score you. Today when laws are created they are attached to some kind of technology, which means automation will run the processes on servers 24/7. I don’t care how far they go, and another abstract confirms my thoughts too: machines don’t and won’t have ethics. Sure, they can respond as programmed and learn by patterns, but that’s as far as they go. The abstract is pretty interesting as a software engineer tries his best to substantiate doing it, but it doesn’t work.
Limitations And Risks Of Machine Ethics (That Really Don’t Exist) - Abstract Basically Substantiating the Existence of What I Coined As “The Attacks of the Killer Algorithms”…
So when laws get passed, the machines go to work, and then everyone falls apart when the machines do what they are programmed to do. We should get upset though, as there’s a lot of flawed data and flawed models that get through. The whole problem with Obamacare is a bunch of killer algorithms and some inaccurate “scoring” taking place, and everyone is in some kind of turmoil.
I don’t like to predict and don’t consider that a talent per se, but here we go again, link below. I’m wired to visualize, and when I learned how to code with data mechanics it became more pronounced, so I talk as I see it; I can’t even explain it, but things seem to roll the way I have been visualizing. I’m not the only one who does that, but I blog it, I guess. This first link is a post from 3 years ago where I said “data used out of context will be a huge discriminatory practice against consumers,” and I saw that 3 years ago as it was already happening and really was the basis of the Occupy movement. Even the Occupy movement didn’t understand it but knew enough that something wasn’t right. They know now though, and even wrote a book on alternative banking. There’s a video at the link below with Ford, NASA and others discussing what they don’t know about big data and how they use their internal quants…
Big Data/Analytics If Used Out of Context and Without True Values Stand To Be A Huge Discriminatory Practice Against Consumers–More Honest Data Scientists Needed to Formulate Accuracy/Value To Keep Algo Duping For Profit Out of the Game
Another post a few months after the one above says kind of the same thing: lying with math models and the clash of the virtual and real world. Recently I decided to call this syndrome “The Grays”.
Hiding, Falsifying, And Accelerating Risk Has Become the Achilles Heel of the US Economy As the “Real” World Clashes With the Values Created From a World of “Fictional Values” Of Formulas and Math
Here’s more from the archives, basically addressing the same thing; I have more in the archives too, but this is enough.
Credit Agencies Mining Social Networks To Verify Identities and Assess Consumer Credit Worthiness
Health Insurers and Others Trying to Track Junk Food Consumers Purchase–Attack of the Killer Algorithms for Corporate Profits
The Moocher Class–Jon Stewart Explains-Humorous Look at Models and Algorithms For Profit Affecting Consumers, Those Created By “Rich Data Modeling Moochers”
mHealth and Other Technologies in Healthcare Experience Slow Growth As the Data Selling Epidemic for Sheer Profit in the US Continues to Grow Leaps and Bounds, Leaving Manufacturing in the Hole and Non Competitive
Big Data–The Data Science Code of Ethics, Designed By Those Who Create Models - Don’t Fall Victim To Writing Fictitious Code and Models Just to Make Money With Clients Demanding Such
Insurance Companies Are Buying Up Consumer Spending Data–Time Is Here to License and Tax the Data Sellers–As Insurers Sell Tons of Data, They Get Flawed Data Back When Data Buyers Use It Out of Context Too
Cathy O’Neil, Mathematician/Quant: Wall Street Quants The Culture, Big Data Mechanics, Algorithms, Data Mining, Lack of Privacy, Web Profiling, Health Insurance Profiles and Modeling Abuse…
Here’s the very important report; it’s done very fairly, and the rest of the world is watching the insanity of the “scoring of America.” They couldn’t have picked a better title either.
So all this scoring comes back around to another post, link below, that I made about 2 years ago too, saying that half of the analytics purchased will be a waste, as some of the math formulas used will be based on “scoring” that is not accurate, has not been replicated, or sits behind proprietary code where we just sit there and take banks and companies at their word while they pack the money away. Ever wonder why banks and corporations are so cash rich today? Well, think scoring and algorithms. Again, some scoring is ok if it can be replicated, verified for accuracy, and shared with the one who is “scored,” as models and code “run hog ass wild” if nobody’s watching the shop.
Half of Analytics Investments By Companies and Banks Will Be a Waste–What Do We Analyze with Big Data and Does It Have Value–Some Algo Fairies Would Do Better at Disneyland…
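Just to show what I mean by scoring that could be replicated and verified, here’s a minimal made-up sketch; the attribute names and weights are hypothetical, not from any real data broker’s or insurer’s model. Most of these “scores” boil down to a weighted formula over data points somebody bought, and if the inputs and weights were shared with the person being scored, anyone could re-run the number and check it.

```python
# A made-up consumer "score" - attribute names and weights are hypothetical,
# for illustration only, not from any real data broker's or insurer's model.

# Data points a broker might have bought and resold about one consumer
consumer = {
    "junk_food_purchases_per_month": 12,
    "late_payments_last_year": 1,
    "zip_code_income_percentile": 40,  # proxy data like this is where discrimination sneaks in
}

# The "secret sauce" is nothing but weights somebody picked
weights = {
    "junk_food_purchases_per_month": -1.5,
    "late_payments_last_year": -20.0,
    "zip_code_income_percentile": 0.5,
}

def consumer_score(data, wts, baseline=100.0):
    """Weighted sum of the data points. If the inputs and weights were
    disclosed, the person being scored could replicate and verify this."""
    return baseline + sum(wts[key] * data[key] for key in wts)

print(consumer_score(consumer, weights))  # one number that quietly follows you around
```

The point isn’t the arithmetic; it’s that the person being scored never sees the inputs or the weights, so there’s no way to catch flawed data or a flawed model.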
Everyone has all their privacy meetings, but again, keep in mind what this report says: so many “scores” are secret, and profits are made as the data gets sold too. There’s absolutely nothing in the world that any lawmakers can do, short of stopping all data selling (and we know that’s not going to happen), if we don’t have an index of ALL the data sellers. This could be done with a license, as “scoring” and “data selling” go hand in hand for profit.
I can’t believe we have intelligent people who, I guess, succumb to lobbyists here and cannot see that you need to identify who you are going to regulate. Again, that index would be a license requirement, as well as a page on a website for every seller to list what kind of data they sell and to whom. Data brokers would probably not like that, but hey, they sell our data that’s out there in public records and elsewhere; why shouldn’t the person whose data gets sold know who bought it as well? It’s our data and information all about us.
I wouldn’t mind that at all, getting to see who has purchased data about me. It would certainly slow down the automation a bit and would require the billion dollar businesses to clean up some of their data messes too. When you read the link below, one data broker got off the hook: a million dollar fine got reduced to $60,000 because he didn’t have the money. What’s wrong with the judge who did that?
If that were a consumer with a million dollars worth of medical bills, they would go to bankruptcy court and create more credit problem data for the data sellers to sell and get richer.
Two Data Brokers Get Fined by the FTC For Non Compliance, One Gets Most All Of The Million Dollar Fine Reduced By the Court as They Could Not Afford To Pay, But If You Are A Consumer With Medical Bills You Go To Bankruptcy Court…
This is also holding up a lot of innovation in healthcare. Everyone wants their mobile health apps and devices, but not as long as there’s a cloudy privacy statement to read that you don’t understand, after which your data gets sold and you have no clue to whom or where, and people and companies can lie about it too, as there’s no due diligence at all.
So the secondary problem here is that the data sellers get richer, as the scoring is one of the key elements that makes data worth more; something has assigned value to the queries that were created, and everyone’s privacy is busted.
Also just for fun, the Privacy Forum has this interactive map of all the medical privacy breaches, and this seems like a good place for that too :) I didn’t see one state untouched. BD
Brief Summary of Report
“This report highlights the unexpected problems that arise from new types of predictive consumer scoring, which this report terms consumer scoring. Largely unregulated either by the Fair Credit Reporting Act or the Equal Credit Opportunity Act, new consumer scores use thousands of pieces of information about consumers’ pasts to predict how they will behave in the future. Issues of secrecy, fairness of underlying factors, use of consumer information such as race and ethnicity in predictive scores, accuracy, and the uptake in both use and ubiquity of these scores are key areas of focus.
The report includes a roster of the types of consumer data used in predictive consumer scores today, as well as a roster of the consumer scores such as health risk scores, consumer prominence scores, identity and fraud scores, summarized credit statistics, among others. The report reviews the history of the credit score – which was secret for decades until legislation mandated consumer access — and urges close examination of new consumer scores for fairness and transparency in their factors, methods, and accessibility to consumers.”