So it’s interesting to talk to somebody who comes from a world where we’re a little more familiar with that kind of transparency, and we build transparency around the edges of it. But what we do is we build accountability into it, right? And that’s being used to build electric vehicle charging stations, affordable housing, parks, trees, and all these things to abate the impact of the environmental discrimination that these neighborhoods faced in the past. So just removing the race of the people involved doesn’t get at all the ways discrimination creeps into society. I really appreciate how he articulates the dream of machine learning: that we would get rid of bias and discrimination in official decisions. Vinhcent was very honest that we’re just in our infancy of learning how to make sure that we know how algorithms make their decisions.
We can use these services, right, this micro-targeting. Let’s not use it to sell predatory ads, but let’s use it to reach the people that need it, like government assistance programs. And we don’t know why that system works the way it does.

Danny: Sometimes we talk about algorithms as though we’ve never encountered them in the world before, but in some ways, governance itself is this incredibly complicated system.

Danny: Vinhcent Le, thank you so much for coming and talking to us.

Danny: Yeah, I guess the lesson that, you know, a lot of people have learned in the last few years, and everyone else has kind of known, is that this sort of prejudice is wired into so many systems.
Does your area have higher unemployment rates? So we took all of those categories that banks are using to discriminate against people in loans, and we’re using those same categories to determine which areas of California get more access to cap-and-trade reinvestment funds. As you know, California has cap and trade. So California has all these great government assistance programs that pay for your internet. And we got into a debate, you know, a couple years back about how that money should be spent, and what California did was create an algorithm, with the input of a lot of community members, that determined which cities and regions of California would get that funding. Because that’s why I got into this work.
So we know how the process at least is going to work. I was like, this is objective, this is going to be data-driven, things are going to be great. And then running dummy data through the systems to try to see what’s going on. Looking at the results across the board, not just about one person, but about a lot of people, in order to see if there’s a disparate impact. Then looking at what the algorithms are putting out. What was interesting is that, you know, Vinhcent comes out of the world of home mortgages and banking and other areas, and Greenlining itself — who gets to buy houses, where, and at what terms. That world has a lot of mechanisms already in place both to protect people’s privacy and to have more transparency. We didn’t use any racial terms, but we used data sources that are associated with redlining.
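The auditing approach described here — feeding dummy data through a decision system and comparing outcomes across whole groups rather than inspecting any single decision — can be sketched in a few lines. This is a minimal illustration, not anything used by Greenlining or California: the `decision_system` function, the group labels, and the 80% ("four-fifths") threshold are all hypothetical assumptions for the example.

```python
def decision_system(applicant):
    # Stand-in for an opaque scoring model; this one approves on income alone.
    return applicant["income"] >= 40_000

def disparate_impact_ratio(applicants, group_key="group"):
    """Selection rate of each group, divided by the highest group's rate."""
    totals, approved = {}, {}
    for a in applicants:
        g = a[group_key]
        totals[g] = totals.get(g, 0) + 1
        if decision_system(a):
            approved[g] = approved.get(g, 0) + 1
    rates = {g: approved.get(g, 0) / totals[g] for g in totals}
    top = max(rates.values())
    return {g: r / top for g, r in rates.items()}, rates

# Dummy data: no race field at all, but income correlates with group
# membership, so the "neutral" rule still produces a skewed outcome.
dummy = (
    [{"group": "A", "income": 50_000}] * 80
    + [{"group": "A", "income": 30_000}] * 20
    + [{"group": "B", "income": 50_000}] * 40
    + [{"group": "B", "income": 30_000}] * 60
)
ratios, rates = disparate_impact_ratio(dummy)
# Group B's approval rate is half of group A's; under a four-fifths rule
# (ratio below 0.8) this would flag a disparate impact worth investigating.
```

The point of the sketch mirrors the point of the transcript: no individual decision looks discriminatory, and no protected attribute appears in the rule, yet the aggregate comparison surfaces the disparity.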