Post by account_disabled on Mar 7, 2024 17:31:39 GMT 10
A few months ago, Apple announced that it would launch an innovative credit card designed to support the financial lives of its customers.
This is the Apple Card, a card that promises easier management from your phone, an easy-to-understand view of your spending, a new level of privacy and security, and cash back on every purchase.
The Apple Card is meant to be built on simplicity and transparency: it has no fees, and it is designed to encourage customers to pay less interest.
In addition, the company has designed a physical titanium card for purchases in places where Apple Pay is not accepted. The card has no number, security code, expiration date, or signature; all of that information is available in the Wallet app.
However, although it sounds like an excellent option, Apple is now under investigation by the New York Department of Financial Services.
Sexism detected in Apple Card algorithm
A few days ago, David Heinemeier Hansson, an influential Danish developer, declared on Twitter that the Apple Card is "fucking sexist."
"My wife and I file joint tax returns, we live in shared property, and we have been married for a long time. Yet Apple's algorithm believes I deserve a credit limit 20 times greater than hers."
For his part, Steve Wozniak, co-founder of Apple, said the same thing happened to him and his wife, and suggested that the algorithm used to set limits could be biased against women.
"The same thing happened to us," he said in response to that tweet. «I obtained up to 10 times more credit limit. "We have no accounts, no credit cards, no separate assets."
This concern has led to an investigation aimed at bringing to light how Goldman Sachs, the bank that issues the Apple Card, grants credit.
“New York law prohibits discrimination against protected classes of individuals,” Linda Lacewell, the superintendent of the New York State Department of Financial Services, wrote in a blog post.
This means that an algorithm, like any other method of determining creditworthiness, cannot result in disparate treatment based on individual characteristics such as age, creed, race, color, sex, sexual orientation, or national origin, among others. The question of discrimination in algorithmic decision-making also extends to other areas of financial services.
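To make the idea of disparate treatment in algorithmic decisions concrete, here is a minimal, hypothetical Python sketch of the kind of disparity check an auditor might run on credit-limit outcomes. The data, group labels, and function names are invented for illustration only; they do not come from Apple, Goldman Sachs, or DFS, and this is not the methodology of any actual investigation.

```python
# Hypothetical sketch: compare average approved credit limits across a
# protected attribute. All records below are invented for illustration.

from statistics import mean

# Toy records: (applicant_gender, approved_credit_limit_in_usd)
decisions = [
    ("male", 20000), ("female", 1000),
    ("male", 15000), ("female", 1500),
    ("male", 18000), ("female", 2000),
]

def average_limit_by_group(records):
    """Group approved limits by the protected attribute and average them."""
    groups = {}
    for gender, limit in records:
        groups.setdefault(gender, []).append(limit)
    return {gender: mean(limits) for gender, limits in groups.items()}

averages = average_limit_by_group(decisions)
ratio = averages["female"] / averages["male"]

print(f"Average limits by group: {averages}")
print(f"Female/male limit ratio: {ratio:.2f}")
# A ratio far below 1.0 for otherwise comparable applicants would not prove
# intent, but it is the kind of pattern a regulator would ask an issuer to
# explain, even if gender is never an explicit input to the model.
```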
For its part, Goldman Sachs, one of the largest investment banks on the planet, published a statement saying the following:
"We do not and will not make decisions based on factors such as gender."
Not the first time…
US healthcare giant UnitedHealth Group is also being investigated by DFS over complaints about an algorithm suspected of favoring white patients over black patients.
As for the Apple Card, the investment bank told Bloomberg: "Our credit decisions are based on a client's creditworthiness, and not on factors such as gender, race, age, sexual orientation or any other aspect prohibited by law."
Apple, meanwhile, has not yet released an official position on the controversy.