
Thursday, September 30, 2021

EA promotes Laura Miele to COO, making her one of the most powerful women in gaming - The Verge

Electronic Arts is promoting chief studios officer Laura Miele to chief operating officer, the company announced Thursday. The change is a big promotion for Miele, who already held significant leadership responsibility at the company, overseeing 25 different studios. The new role will give Miele greater oversight of the company and arguably makes her the most powerful woman in gaming, an industry with few female executives, fewer still in the C-suite, and where those C-suite execs are often in charge of HR or finance rather than the company’s products.

Ubisoft did make Virginie Haas its chief studios operating officer last August, following scandals over a toxic culture, including sexual harassment and misconduct that reached as high as the C-suite ranks.

Miele joined EA in 1996 and has served as chief studios officer since April 2018. The Verge spoke with Miele in July, where she discussed how the pandemic changed development at EA. Miele will move into the role over the next few months, according to an SEC filing (PDF).

EA also announced that chief financial officer Blake Jorgensen will be leaving the company. He’s expected to depart in 2022, and a search to replace him “will begin immediately.” Chris Bruzzo, who was previously the company’s executive vice president of marketing, commercial, and positive play, will become the company’s chief experience officer.




Technology - Latest - Google News
October 01, 2021 at 08:18AM
https://ift.tt/3a0paJ7


Gloria Estefan reveals she was sexually abused when she was 9 - NBC News

NEW YORK — Gloria Estefan has revealed that, at the age of 9, she was sexually abused by someone her mother trusted.

The Cuban-American superstar spoke for the first time publicly about the abuse and its effects on her during an episode of the Facebook Watch show “Red Table Talk: The Estefans” that aired Thursday.

“He was family, but not close family. He was in a position of power because my mother had put me in his music school and he immediately started telling her how talented I was and how I needed special attention, and she felt lucky that he was focusing this kind of attention on me,” the singer said.

Estefan, who was born in Cuba and moved to Miami with her family when she was a toddler, revealed the abuse at the top of the show, which featured Clare Crawley, the first Latina “Bachelorette.” On the episode, called “Betrayed by Trusted Adults,” Crawley talked about child abuse she experienced at the hands of a priest.

The Associated Press does not typically identify victims of sexual abuse unless they agree to be named or share their stories publicly.

Sitting at the round red table with her co-hosts — daughter Emily Estefan and niece Lili Estefan — Estefan opened by saying that “93 percent of abused children know and trust their abusers, and I know this, because I was one of them.”

Co-hosts, from left, Lili Estefan, Gloria Estefan and Emily Estefan with guest Clare Crawley during a taping of "Red Table Talk: The Estefans." (Facebook Watch via AP)

“You’ve waited for this moment a long time,” her niece told her.

“I have,” Estefan replied.

The three held hands with teary eyes.

She did not name her abuser but described how she tried to stop him. She said the abuse started little by little before moving fast, and that she knew that she was in a dangerous situation after confronting him.

“I told him, ‘This cannot happen, you cannot do this.’ He goes: ‘Your father’s in Vietnam, your mother’s alone and I will kill her if you tell her,’” Estefan said. “And I knew it was crazy, because at no point did I ever think that it was because of me that this was happening. I knew the man was insane and that’s why I thought he might actually hurt my mother.”

Estefan said she started making up excuses to avoid going to music lessons. Her daughter Emily asked if her grandmother had any inkling something was going on. People didn’t talk about those things back then, Estefan replied.

She tried to reach her dad, with whom she exchanged voice tapes while he was posted in Vietnam.

Recordings in Spanish from when Estefan was 9 were played at the show with English subtitles:

Gloria: “I’m taking guitar lessons. I like them but the exercises are a little hard.”

Her dad: “Mommy told me that the owner of the academy where you’re taking your guitar lessons is very proud of you.”

Gloria: “I like the notes, but it’s a little boring to study the notes.”

Her dad: “Mommy tells me that he said that you are a born artist.”

Estefan said the level of anxiety made her lose a “circle of hair.”

“I couldn’t take it anymore,” she said, so one night she ran to her mother’s bedroom at 3 a.m. and told her what was going on.

Her mother called the police, but the officers advised her not to press charges because the trauma of testifying would be too harmful.

Both Crawley and Estefan said during the show that they didn’t like to be called victims. Crawley called herself a survivor.

Estefan said she didn’t tell the producers she was going to reveal her story on Thursday’s episode. No one knew about the abuse except for her family, said the singer, who has been married to music producer Emilio Estefan for over four decades.

She also said that, when her mother started inquiring about this man within the family, an aunt shared that he had abused her years back in Cuba.

The Associated Press asked the show’s publicist if Estefan could answer some questions, including if the man was still alive. The publicist told the AP that she would not make further comments.

On “Red Table Talk,” Estefan recalled almost going public in the mid-80s, when her hit “Conga” with the Miami Sound Machine was at the top of the Billboard charts and “this predator, who was a respected member of the community,” had the audacity to write a letter to a paper criticizing her music.

“At that moment, I was so angry that I was about to blow the lid off of everything,” she said. “And then I thought: ‘My whole success is gonna turn into him!’”

“It’s manipulation and control, but that’s what they do, they take your power,” she added, also admitting the fear that there could be other victims makes her feel bad.

After introducing Crawley and telling her that she didn’t want to sit quietly while she shared her story, Estefan said she had been waiting for the right opportunity and space to tell hers.

The show was that space.

“This is one of the reasons why I said yes to the ‘(Red) Table (Talk)’ at all, because we wanted to create this space where we talk about important things that hopefully will make a difference to everybody that’s watching out there.”



Scarlett Johansson, Disney Settle Explosive ‘Black Widow’ Lawsuit - Hollywood Reporter

Scarlett Johansson and Disney have settled a breach of contract lawsuit over the star’s Black Widow payday, The Hollywood Reporter has learned. Terms of the deal were not disclosed.

“I am happy to have resolved our differences with Disney,” stated Johansson. “I’m incredibly proud of the work we’ve done together over the years and have greatly enjoyed my creative relationship with the team. I look forward to continuing our collaboration in years to come.”

Disney Studios chairman Alan Bergman added: “I’m very pleased that we have been able to come to a mutual agreement with Scarlett Johansson regarding Black Widow. We appreciate her contributions to the Marvel Cinematic Universe and look forward to working together on a number of upcoming projects, including Disney’s Tower of Terror.”


The explosive suit, filed by the actress in July in Los Angeles Superior Court, claimed that the studio sacrificed the film’s box office potential in order to grow its fledgling Disney+ streaming service. Disney countered that Johansson was paid $20 million for the film.

The settlement brings to a close a back-and-forth PR battle that pitted the CAA-repped star against Disney and was poised to have dramatic implications for all of Hollywood’s major studios. Johansson’s cause received support in the industry, with talent and executives — including Jamie Lee Curtis, Marvel’s WandaVision star Elizabeth Olsen and mogul Jason Blum — speaking out on her behalf.

At the time of the complaint, a Disney spokesperson said, in part, “The lawsuit is especially sad and distressing in its callous disregard for the horrific and prolonged global effects of the COVID-19 pandemic.” CAA co-chairman Bryan Lourd shot back that Disney “shamelessly and falsely accused Ms. Johansson of being insensitive to the global COVID pandemic, in an attempt to make her appear to be someone they and I know she isn’t.”

In her complaint, Johansson said the Marvel tentpole had been guaranteed an exclusive theatrical release when she signed her deal. She alleged that her contract was breached when the film was simultaneously released on Disney+.

As the coronavirus pandemic wreaked havoc on Hollywood over the past 18 months, Black Widow was one of many big-budget movies, also including Warner Bros.’ Wonder Woman 1984 and Disney’s Cruella and Jungle Cruise, that bowed simultaneously on streaming and in theaters. But to date, Johansson is the only major movie star to sue.

“Why would Disney forgo hundreds of millions of dollars in box office receipts by releasing the Picture in theatres at a time when it knew the theatrical market was ‘weak,’ rather than waiting a few months for that market to recover?” the complaint asked. “On information and belief, the decision to do so was made at least in part because Disney saw the opportunity to promote its flagship subscription service using the Picture and Ms. Johansson, thereby attracting new paying monthly subscribers, retaining existing ones, and establishing Disney+ as a must-have service in an increasingly competitive marketplace.”

Black Widow, which has earned $379 million at the worldwide box office to date, debuted at the same time in theaters and on Disney+ Premier Access for an additional $30. But in what was viewed by rival studio executives as a major miscalculation, Disney boasted on July 11 that Black Widow earned $60 million via Disney+ Premier Access, opening the door for a fierce clash. After all, Johansson had been considering litigation for several months, says a source familiar with the suit. Until the afternoon of July 28, she believed Disney would make an offer and that she wouldn’t have to file a suit. But Disney stayed in the mode of, “Let’s keep talking,” the source adds. Johansson was particularly incensed by the announcement, which pleased Wall Street but not the talent and representation community.

According to the complaint, Disney’s move “not only increased the value of Disney+, but it also intentionally saved Marvel (and thereby itself) what Marvel itself referred to as ‘very large box office bonuses’ that Marvel otherwise would have been obligated to pay Ms. Johansson.”

Johansson vs. Disney marked the latest iteration of a profit-participation dispute that is all too common in Hollywood, with actors fighting studios over their backend compensation or the definition of “net profit.” Very few of these battles percolate to the surface; they often come to a resolution before lawyers get involved, or the actor’s contract contains an arbitration provision and the whole process remains confidential. (A source familiar with Johansson’s suit says her contract does have an arbitration provision, but her lawyers were willing to test it.)

“The exception is when there’s so much money involved or if there’s a level of acrimony that has reached a point of no return, and people are going to stand on principle,” attorney James Sammataro tells THR. “That statement by Disney confirmed the latter, but it still is a shocking statement to make — to paint someone as being insensitive and playing the whole, ‘You’re so out of touch’ card. You could probably make the same argument about Disney; ‘Yeah. You’ve been generating millions, if not billions, during the pandemic.’”

In the wake of Johansson’s suit, more than a handful of other A-listers were said to be considering filing similar suits. (Jungle Cruise star Dwayne Johnson was not one of them, given that he has a different compensation structure than Johansson.) But that has not come to fruition yet. Cruella’s Emma Stone closed a deal two weeks after Johansson’s suit to star in a sequel of Disney’s live-action film, offering a sign that Disney was working to secure and mollify talent amid the charged atmosphere.

While Disney has faced criticism for its handling of talent deals during the pandemic, WarnerMedia took a different approach by proactively doling out as much as $200 million to pay a long list of stars whose Warner Bros. films were simultaneously opening in theaters and on its HBO Max streaming service, including Patty Jenkins, Gal Gadot and Will Smith.

Johansson is represented by Kasowitz partner John Berlinski, while Daniel Petrocelli has been repping Disney.


Super Bowl halftime show to feature Dr. Dre, Snoop Dogg, Eminem, Mary J. Blige, Kendrick Lamar - ESPN

LOS ANGELES -- Dr. Dre, Snoop Dogg, Eminem, Mary J. Blige and Kendrick Lamar will perform for the first time onstage together at the Pepsi Super Bowl Halftime Show.

The NFL, Pepsi and Roc Nation announced Thursday that the five music icons will perform on Feb. 13 at SoFi Stadium in Inglewood, California. Dre, Snoop Dogg and Lamar are Southern California natives.

Dre emerged from the West Coast gangster rap scene alongside Eazy-E and Ice Cube to help form the group N.W.A., which made a major mark on hip-hop culture and the music industry with controversial lyrics in the late 1980s. Dre is responsible for bringing forth rap stars such as Snoop Dogg, Eminem, 50 Cent and Lamar. Dre also produced Blige's No. 1 hit song "Family Affair."

"The opportunity to perform at the Super Bowl Halftime show, and to do it in my own backyard, will be one of the biggest thrills of my career," Dre said in a statement. The seven-time Grammy winner added that their halftime performance will be an "unforgettable cultural moment."

The Super Bowl returns to the Los Angeles area for the first time since 1993. It's the third year of collaboration between the NFL, Pepsi and Roc Nation.

Roc Nation and Emmy-nominated producer Jesse Collins will serve as co-producers of the halftime show. The game and halftime show will air live on NBC.

The five music artists have a combined 44 Grammys. Eminem has the most with 15.

Roc Nation founder Jay-Z said in a statement that their show will be "history in the making."

Dre, Snoop Dogg, Eminem, Blige and Lamar join a list of celebrated musicians who have played during Super Bowl halftime shows, including Beyonce, Madonna, Coldplay, Katy Perry, U2, Lady Gaga, Michael Jackson, Jennifer Lopez, Shakira and most recently The Weeknd.

The NFL and Pepsi will join together to support the launch of Regional School #1, a magnet high school in South Los Angeles. It's set to open for students next fall as part of the Los Angeles Unified School District.

The high school is based on the USC Iovine and Young Academy, a program founded by Jimmy Iovine and Andre "Dr. Dre" Young. It will offer an educational model focused on the theme of integrated design, technology and entrepreneurship.

"This effort will help develop and inspire the next generation of entrepreneurs and innovators," said Megan K. Reilly, the L.A. Unified interim superintendent. "We are excited about the additional opportunities this partnership will bring to our students."


Amazon Basics ripped off accessories, now Amazon is coming for Fitbit, Ecobee, and more - The Verge

In the image above there are two wearables. One is Fitbit’s recently released Charge 5, a $179.95 fitness tracker designed to measure everything from your heart rate to your sleep and even your skin temperature. The other is Amazon’s new $79.99 Halo View fitness tracker, which Amazon says can measure everything from your heart rate to your sleep and skin temperature. Ten points if you can tell me which is which.

The Halo View was just one of a host of new devices announced by Amazon at its now annual fall hardware event this week. But while many of Amazon’s new products feature completely original designs and features, like its cute Astro “home robot” or Ring-branded home surveillance drone, there were a handful that bear a striking resemblance to preexisting products.

Take Amazon’s new $59.99 smart thermostat, which works with Amazon’s voice assistant Alexa, and promises to detect when you’re home and adjust the temperature accordingly. That’s very similar to what other Alexa-enabled thermostats like the $250 Ecobee smart thermostat offer, but at a fraction of the price. Not to mention Amazon’s design is also similar to a preexisting smart thermostat produced by a company called Tado (which itself retails for the equivalent of around $240).

Not every smart thermostat needs to look completely original or have a unique set of features (after all, there’s only so much a thermostat can do). But the announcement of Amazon’s own Smart Thermostat comes just months after The Wall Street Journal reported that Ecobee had been nervous about sharing additional data with Amazon, in part due to fears that this data could help it launch competing products, as well as concerns that it could harm consumer privacy. Ecobee was reportedly told that failing to provide this data, which would send information to Amazon about the device’s status even when a customer wasn’t using it, could put the company at risk of losing its Alexa certification for future models or not being featured in Prime Day sales.

In response to The Verge’s enquiry, Amazon said it hadn’t used data from any other Alexa-connected thermostats to design its Smart Thermostat. It said the device had been co-created with Resideo, a company that has also worked on Honeywell’s Home thermostats, and that Ecobee continues to be one of its valued partners.

Meanwhile, Amazon’s new Halo membership features look like obvious competitors to MyFitnessPal. Halo Nutrition is designed to help users find recipes and cook food that fits their dietary needs, similar to the Meal Plans feature MyFitnessPal already includes as part of its premium plan. There’s also Halo Fitness’ guided workouts, similar to the self-guided routines from MyFitnessPal. But Amazon’s health subscription is a lot cheaper than MyFitnessPal’s premium tier. Amazon includes a free year of its Halo membership services with the purchase of its new Halo View fitness tracker, and it retails for $3.99 / month thereafter, which is less than half the price of MyFitnessPal’s $9.99 premium tier.

When we asked Amazon about these similarities, it said it had not copied other companies, and that its Halo service includes unique features not available with other fitness trackers.

Responding to questions from The Verge, Amazon said it’s “pioneered hundreds of features, products, and even entirely new categories” throughout its history. “Amazon’s ideas are our own,” the company said, citing products such as the Kindle, Amazon Echo, and Fire TV as prominent examples of its original inventions.

I’m not trying to claim that Amazon is breaking any rules with these products. There are only so many ways you can design a screen that straps to your wrist and shows you your heart rate — even if this one has clear Fitbit vibes — or a panel that attaches to your wall to control the temperature. Even complicated devices like smartphones have seen their designs gradually converge over the years, a process not helped by the fact that many manufacturers use components supplied by the same small handful of companies.

But the similarities look cynical coming from Amazon, which has been criticized for ripping off the designs of products sold on its platform and then undercutting them on price. Earlier this year, bag and accessory manufacturer Peak Design drew attention to the startling similarity between its $99.95 Everyday Sling and Amazon Basics’ $32.99 Camera Bag, for example. Amazon’s version of the bag has since been discontinued, the company told The Verge.

Amazon’s cloning of the Peak Design bag wasn’t an isolated incident, either. In 2019, striking similarities were also pointed out between the $45 shoes produced by Amazon’s 206 Collective label and Allbirds’ $95 equivalent. The similarities prompted Allbirds CEO Joey Zwillinger to respond in a Medium post saying that he was “flattered at the similarities that your private label shoe shares with ours,” but politely asked that Amazon also “steal our approach to sustainability” and use similarly renewable materials. Amazon said that the shoe has also since been discontinued, but that it continues to offer similar styles. Amazon also said its original design did not infringe on Allbirds’ design, and that the aesthetic is common across the industry.

Or what about the Amazon Basics Laptop Stand that Bloomberg reported on in 2016, which was launched at around half the price of Rain Design’s (at the time) bestselling $43 model? Harvey Tai, Rain Design’s general manager, said that the company’s sales had slipped since Amazon’s competing model appeared on the store, although he admitted that “there’s nothing we can do because they didn’t violate the patent.”

Copying and attempting to undercut dominant market players is nothing new. But Amazon is in a fairly unique position in that it’s not just competing with these products; in a lot of cases, it’s also selling them via its own platform. That theoretically gives it access to a goldmine of data which could be invaluable if it wanted to launch a competitor of its own.

Amazon was accused of doing exactly this in a Wall Street Journal investigation last year, which alleged that Amazon’s employees “have used data about independent sellers on the company’s platform to develop competing products.” The WSJ specifically cited an instance where an unnamed Amazon private-label employee accessed detailed sales data about a car-trunk organizer from a company called Fortem launched in 2016. In 2019, Amazon launched three similar competitors under its Amazon Basics label.

The same report also detailed an instance of employees accessing sales data for a popular office-chair seat cushion from Upper Echelon Products, before Amazon launched its own competitor.

Amazon tells The Verge that an internal investigation conducted following the publication of the WSJ’s report found no violations of its policies prohibiting the use of non-public individual seller data.

Whether or not Amazon’s employees are breaking its rules, regulators have taken notice. Last year the EU accused Amazon of using “non-public seller data” to inform Amazon’s own retail offers and business decisions. “Data on the activity of third party sellers should not be used to the benefit of Amazon when it acts as a competitor to these sellers,” the European Commission’s antitrust czar, Margrethe Vestager, said at the time. The Commission has yet to issue a final report or findings, and in a statement Amazon said it disagreed with the accusations.

For its part, Amazon says it has a policy of preventing its employees from using “nonpublic, seller-specific data to determine which private label products to launch.” But the company’s founder and former CEO Jeff Bezos told lawmakers last year that he couldn’t guarantee the policy has never been violated, and sources interviewed by The Wall Street Journal said that employees found ways around these rules.

These accusations have so far centered around low-tech items like bags, shoes, and trunk organizers. But as Amazon has expanded into more areas of consumer tech, its designs are once again straying very close to the competition.

Given these concerns, it seems especially bizarre that Amazon was willing to reference the price of competing smart thermostats sold via its platform during the launch. Its smart thermostat is “less than half the average cost of a smart thermostat sold on Amazon.com,” the company’s senior vice president of devices and services, Dave Limp, said.

Once again, I need to stress that there’s nothing illegal (to my knowledge) about using public information like pricing in order to inform your own products. But taking a moment to specifically reference the pricing of competing devices sold on your own monolithic online store is an odd choice in the midst of all this scrutiny.

It will be impossible to know how closely the functionality of each of Amazon’s new products are to their competitors until we’ve tried them for ourselves. But given Amazon’s size and market power, these kinds of awkward questions need to be asked. After all, Amazon is treading an awkward line between operating one of the largest sales platforms in the world, and competing within them as an increasingly prolific consumer tech manufacturer. It’s a difficult balance, and regulators are watching.


Can you use MacBook Pro chargers for iPhone and iPad fast charging? - 9to5Mac

Recommendations to fast charge iPhone or iPad often include picking up the 20W power adapter from Apple or similar from a third party. But what if you already have a higher-powered USB-C charger from your MacBook Pro or MacBook Air? Follow along for which iPhones and iPads you can fast charge with Apple’s MacBook chargers or similar third-party chargers.

Update 9/30: With the iPhone 13 Pro Max able to pull up to 27W of power, using 30W+ power adapters will give you the fastest charging times. It’s unclear for now whether the whole iPhone 13 lineup can pull the 27W max, but as detailed below, it doesn’t hurt to use a higher-powered wall plug, as the iPhone determines the power it draws.

If you want something with more ports than your MacBook charger, two of the best options are Satechi’s compact 4-port 66W GaN USB-C Charger and Anker’s 36W dual-port USB-C charger.

Apple used to ship an 18W USB-C power adapter with the iPhone 11 Pro models and a 5W adapter with older iPhones. However, starting in fall 2020 with the iPhone 12/12 Pro launch, Apple stopped including a power adapter in the box with all new iPhones.

Fast charging offers around 50% battery in 30 minutes. But picking up a new USB-C to Lightning cable and 20W charging block from Apple costs $40 if you need both. Third-party options often cost less, but what about using something you already have?

The good news is that modern iPhones and iPads work with all of the Mac notebook USB-C chargers, even the 96W model that comes with the 16-inch MacBook Pro. While it may sound risky at first, it’s safe to use any of Apple’s USB-C chargers, as your iPhone or iPad is what determines the power it receives, not the charger. Apple even does its own testing with the whole range of its USB-C power adapters.
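The "device decides" behavior described above can be sketched as a toy version of USB Power Delivery negotiation: the charger advertises what it can supply, and the device picks the highest option it can accept. This is an illustrative model only; the function name and wattage figures are assumptions, not Apple's actual charging firmware.

```python
# Toy sketch of USB-PD-style negotiation: the device (sink) chooses
# from the charger's advertised power profiles, so a 96W MacBook
# charger never "forces" 96W into an iPhone.
from typing import List


def pick_power(charger_profiles_w: List[float], device_max_w: float) -> float:
    """Return the power the device will draw: the highest advertised
    profile that does not exceed what the device can accept."""
    usable = [p for p in charger_profiles_w if p <= device_max_w]
    # If every advertised profile exceeds the device's limit, the
    # device simply caps its own draw at its maximum.
    return max(usable) if usable else device_max_w


# Hypothetical figures: a 96W charger offering several fixed profiles,
# paired with an iPhone that tops out around 27W.
print(pick_power([15, 27, 45, 61, 96], 27))  # → 27
```

The takeaway mirrors the article's point: the negotiated result depends on the device's ceiling, not the charger's, which is why a bigger adapter is safe but not automatically faster.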


Note: depending on the current capacity of your battery, your device will pull different levels of power. For example, a battery at 10% will draw more power than one at 80%.

Fast charge iPhone and iPad with MacBook chargers?

Apple says the following iOS devices are compatible with the 18W, 20W, 29W, 30W, 61W, 87W, and 96W adapters for fast charging:

  • iPhone 8/8 Plus and later
  • iPad Pro 12.9-inch (1st generation and later)
  • iPad Pro 11-inch (1st generation and later)
  • iPad Pro 10.5-inch
  • iPad Air 3rd generation and later
  • iPad mini 5th generation and later

Apple notes you can use its USB-C to Lightning cable, and that “a comparable third-party USB-C power adapter that supports USB Power Delivery (USB-PD)” will also work, such as Anker’s Powerline series.

If you’re looking for a more flexible USB-C charger or want an extra, Anker’s 36W dual-port USB-C charger and Satechi’s 4-port 66W GaN USB-C Charger are great choices to fast charge iPhones and iPads simultaneously.



Google tells EU court it’s the #1 search query on Bing - Ars Technica

Image caption: Let's see, you landed on my "Google Ads" space, and with three houses, that will be $1,400. (Ron Amadeo / Hasbro)

Google is in the middle of one of its many battles with EU antitrust regulators—this time it's hoping to overturn the record $5 billion fine the European Commission levied against it in 2018. The fine was for unfairly pushing Google search on phones running Android software, and Google's appeal argument is that search bundling isn't the reason it is dominating the search market—Google Search is just so darn good.

Bloomberg reports on Google's latest line of arguments, with Alphabet lawyer Alfonso Lamadrid telling the court, “People use Google because they choose to, not because they are forced to. Google’s market share in general search is consistent with consumer surveys showing that 95% of users prefer Google to rival search engines.”

Lamadrid then went on to drop an incredible burn on the #2 search engine, Microsoft's Bing: “We have submitted evidence showing that the most common search query on Bing is, by far, 'Google.'"

Worldwide, Statcounter puts Google's search engine market share at 92 percent, while Bing is a distant second at 2.48 percent. Bing is the default search engine on most Microsoft products, like the Edge browser and Windows, so quite a few people end up there as the path of least resistance. Despite Bing being the default, Google argues that people can't leave it fast enough, running a navigational query for "Google" to break free of Microsoft's ecosystem.

Google's argument that defaults don't matter runs counter to the company's other operations. Google pays Apple billions of dollars every year to remain the default search on iOS, which is an awfully generous thing to do if search defaults don't matter. Current estimates put Google's payments to Apple at $15 billion per year. Google also pays around $400 million a year to Chrome rival Mozilla to remain the default search on Firefox.


An Interview with Intel Lab's Mike Davies: The Next Generation of Neuromorphic Research - AnandTech

As part of the launch of the new Loihi 2 chip, built on a pre-production version of the Intel 4 process node, the Intel Labs team behind its Neuromorphic efforts reached out for a chance to speak to Mike Davies, the Director of the project. It is perhaps no shock that Intel's neuromorphic efforts have been on my radar for a number of years – as a new paradigm of computing compared to the traditional von Neumann architecture, one meant to mimic brains and take advantage of such designs, it has the potential, if it works well, to shake up specific areas of the industry, as well as Intel's bottom line. Also, given that we've never really covered Neuromorphic computing in any serious detail here on AnandTech, it was a great opportunity to get details on this area of research, as well as the newest hardware, direct from the source.

Mike Davies currently sits as Director of Intel's Neuromorphic Computing Lab, a position held since 2017, having also been a principal engineer on the same project. Mike joined Intel in 2011 as part of the acquisition of Fulcrum Microsystems, where he had worked in IC development for 11 years. Fulcrum's focus was on asynchronous network switch design, and after Intel made the acquisition, that technology eventually made its way into Intel's networking division, and so the asynchronous compute team pivoted to Neuromorphic designs. Mike has been the face of Intel's Neuromorphic efforts, demonstrating the technology and the extent of the research and collaborations with industry partners and academic institutions at industry events.


Mike Davies
Director, Intel Labs

Dr. Ian Cutress
AnandTech

Ian Cutress: Can you describe what Neuromorphic Computing is, and what it means for Intel?

Mike Davies: Neuromorphic Computing is a rethinking of computer architecture, inspired by the principles of brains. It is really informed at a very low level by our understanding of neuroscience, and it leads us to an architecture that looks dramatically different from even the latest AI accelerators or deep learning accelerators.

It is a fully integrated memory and compute model, so you have computing elements sitting very close to the storage state elements that correspond to the neural state and the synaptic state that represents the network that you're computing. It's not [a traditional] kind of streaming data model always executing through off-chip memory - the data is staying locally, not moving around, until there's something important to be computed. [At that point] the local circuit activates and sends an event-based message, or a spike, to all the other neurons that are paying attention to it.

Probably the most fundamental difference to conventional architectures is that the computing process is kind of an emergent phenomenon. All of these neurons can be configured, and they operate as a dynamic system, which means that they evolve over time – and you may not know the precise sequence of instructions or states that they step through to arrive at the solution as you do in a conventional model. It's a dynamic process. You proceed through some collective interaction, and then settle into some new equilibrium state, which is the solution that you're looking for.

So in some ways it has parallels to quantum computing which is also computing with physical interactions between its elements. But here we are dealing with digital circuits, still designed in a pretty traditional way with traditional process technology, but the way we've constructed those circuits, and the architecture overall, is very different from conventional processors.

As far as Intel's outlook, we're hoping that through this research programme, we can uncover a new technology that augments our portfolio of current processors, tools, techniques, and technologies that we have available to us to go and address a wide range of different workloads. This is for applications where we want to deploy really adaptive and intelligent behavior. You can think of anything that moves, or anything that's out in the real world - it faces power constraints and latency constraints, and whatever compute is there has to deal with the unpredictability and the variability of the real world. [The compute] has to be able to make those adjustments, and respond to data in real time, in a very fast but low power mode of operation.

IC: Neuromorphic computing has been part of Intel Labs for almost a decade now, and it remains that way even with the introduction of Loihi 2, with external collaborations involving research institutions and universities. Is the roadmap defining the path to commercialization, or is it the direction and learnings from the collaborations that are defining the roadmap?

MD: It's an iterative process, so it's a little bit of both!

But first, I need to correct something - the acquisition I was a part of with Intel, 10 years ago, actually had nothing to do with neuromorphic computing at all. That was actually about Ethernet switches of all things! So our background was coming from the standpoint of moving data around in switches, and that's gone on to be commercialized technology inside other business groups at Intel. But we forked off and used the same kind of fundamental asynchronous design style that we had in those chips, and then we applied it to this new domain. That started about six years ago or so.

But in any case, what you're describing [on roadmaps] is really a little bit of both. We don't have a defined roadmap, given that this is about as basic of research as Intel engages in. This means that we have a kind of vision for where we want to end up – we want to bring some differentiating technologies to this domain.

So in this asynchronous design methodology, we did the best we could at Intel in developing an architecture for a chip with the best methods that we had available. But that was about as far as we could take it, as just one company operating in isolation. So that's why we released Loihi out to an ecosystem, and it's been steadily growing. We're seeing where this architecture performs really well on real workloads with collaborators, and where it doesn't perform well. There have been surprises in both of those categories! So based on what we learn, we're advancing the architecture, and that is what has led us to this next generation.

So while we're also looking for possible near term applications, which may be specializations of this general purpose design that we're developing, long term we might be able to incorporate designs into our mainstream products hidden away, in ways that maybe a user or a programmer wouldn't have to worry that they are present in the chip.

IC: Are you expecting institutions with Loihi v1 installed to move to Loihi v2, or does v2 expand the scope of potential relationships?

MD: In pretty much all respects, Loihi 2 is superior to Loihi v1. I expect that pretty quickly these groups are going to transition to Loihi 2 as soon as we have the systems and the materials available. Just like with Loihi 1, we're starting at the small scale - single-chip / double-chip systems. We built a 768-chip system with Loihi 1, and the Loihi 2 version of that will come around in due course.

IC: Loihi 2 is the first processor publicly confirmed for Intel's first EUV process node, Intel 4. Are there any inherent advantages to the Loihi design that makes it beneficial from a process node optimization point of view?

MD: Neuromorphic Computing, more so than pretty much any other types of computer architecture, really needs Moore's law. We need tiny transistors, and we need tiny storage elements to represent all the neural and the synaptic states. This is really one of the most critical aspects of the commercial economic viability of this technology. So for that reason, we always want to be on the very bleeding edge of Moore's law to get the greatest capacity in the network, in a single chip, and not have to go to 768 chips to support a modest size workload. So that's why, fundamentally, we're at the leading edge of the process technology.

EUV simplifies the design rules, which actually is really great for us because we've been able to iteratively advance the design. We’ve been able to quickly iterate test chips and as the process has been evolving, we've been able to evolve the design and loop feedback from the silicon teams, so it's been great for that.

IC: You say a pre-production version of Intel 4 is used - how much of this is silicon in the lab vs simulation?

MD: We have chips in the lab! In fact, as of September 30th, they'll be available for our ecosystem partners to actually kick the tires and start using them. But as always, it's the software that's really the slower part to come together. So that being said, we’re not at the final version. This process (Intel 4) is still in development, so we aren't really seeing products. Loihi 2 is a research chip, so there's a different standard of quality and reliability and all these factors that go into releasing products. But it certainly means that the process is healthy enough that we can deploy chips and put them on subsystem boards, and remotely access them, measure their performance, and make them available for people to use. My team has been using these for quite some time, and now we're just flipping the switch and saying our external users can start to use them. But we have a ways to go, and we have more versions of Loihi 2 in the lab - it's an iterative process, and it continues even with this release.

IC: So there won't specifically be one Loihi 2 design? There may be varying themes and features for different steppings?

MD: For sure. We've frozen the architecture in a sense, and we have most of the capabilities all implemented and done. But yes, we're not completely done with the final version that we can deploy with all the final properties we want.

IC: I think the two big specifications that most of our readers will be interested in are the die size – going down from 60 mm2 in Loihi 1 to 31 mm2 in Loihi 2. Not only that, but neuron counts increase from 130,000 to a million. What else does Loihi 2 bring to the table?

MD: So the biggest change is a huge amount of programmability that we've added to the chip. We were kind of surprised that, with the applications and the algorithms that started getting developed and quantified with Loihi, we found the more complex the neuron model got, the more application value we could measure. We could see that there was a school of thought that the particular neural characteristics of the neuron model don't matter that much - that what matters more is the parallel assembly of all these neurons, and then that emergent behavior I was describing earlier.

Since then, we've found that the fixed function elements in Loihi have proved to be a limitation for supporting a broader range of applications or different types of algorithms. Some of these get pretty technical, but as an example, one neuron model that we wanted to support (but couldn't) with Loihi is an oscillatory neuron model. When you kick it with one of these events or spikes, it doesn't just decay away like normal, but it actually oscillates, kind of like a pendulum. This is thought in neuroscience to have some connection to the way that we have brain rhythms. But in the neuromorphic community, and even in neuroscience, it's not been too well understood exactly how you can computationally use these kinds of exotic oscillating neuron models, especially when adding extra little nonlinear mathematical terms which some people study.

So we were exploring that direction, and we found that actually there are great benefits, and we can practically construct neural networks with these interesting new bio-inspired neuron models. They can effectively solve the same kinds of problems [we've been working on], but they can shrink the size of the networks and the number of parameters needed to solve them. They're just the better model for the particular task that you want to solve. It's those kinds of things where, as we saw more and more examples, we realized that it's not a matter of just tweaking the base behavior in Loihi - we really had to go and put in more general purpose compute, almost like an instruction set and a little microcode executor, that implements individual neurons in a much more flexible way.
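The oscillatory behavior Davies describes is close to what the literature calls a resonate-and-fire neuron. As a toy sketch - purely illustrative, not Loihi's actual neuron implementation, with all function names and parameters invented for this example - each input spike "kicks" a damped oscillator, and kicks that arrive in phase with the oscillation add up until the neuron fires:

```python
import math

def resonate_and_fire(spike_times, steps=100, freq=0.1, decay=0.05, threshold=1.0):
    """Toy resonate-and-fire neuron: input spikes 'kick' a damped
    oscillator; the neuron fires when the oscillation crosses threshold."""
    re, im = 0.0, 0.0  # oscillator state, like a pendulum's position
    cos_w, sin_w = math.cos(2 * math.pi * freq), math.sin(2 * math.pi * freq)
    out_spikes = []
    for t in range(steps):
        # rotate (oscillate) and damp the state each time step
        re, im = ((re * cos_w - im * sin_w) * (1 - decay),
                  (re * sin_w + im * cos_w) * (1 - decay))
        if t in spike_times:
            re += 1.0          # incoming spike "kicks" the oscillator
        if im > threshold:     # fire on threshold crossing, then reset
            out_spikes.append(t)
            re, im = 0.0, 0.0
    return out_spikes
```

With these toy parameters (period of 10 steps), two kicks one full period apart reinforce each other and trigger an output spike, while a single kick, or two kicks half a period apart, never reach threshold - illustrating why such a neuron can do frequency-selective work that a plain decaying neuron cannot.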

So that's been the big change under the hood that we've implemented. We've done that very carefully to not deviate from the basic principles of neuromorphic architectures. It's not a von Neumann processor or something - there's still this great deal of parallelism and locality in the memory, and now we have these opcodes that can get executed so we don't compromise on the energy efficiency as we go to these more complex neuron models.

IC: So is every neuron equal and able to do the same work, or is this functionality split to a small subset of neurons per core?

MD: All neurons are equal. In Loihi v1, we had one very configurable neuron model - each individual neuron could kind of specify different parameters to be customized to that particular part of the network, and there were some constraints on how diverse you could configure it. The same idea applies, but you can define a couple different [schema], and different neurons can reference and use those different styles in different parts of the network.
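The "styles" idea can be pictured, very loosely, as neurons referencing a few shared, configurable update rules rather than each owning a hard-coded circuit. This is a software analogy only - the names and rules below are invented; the real chip implements this as microcode, not Python:

```python
# Toy illustration of shared "neuron styles" (all names invented):
def leaky(v, inp, p):
    return v * p["decay"] + inp   # leak toward zero, then integrate input

def integrator(v, inp, p):
    return v + inp                # pure integration, no leak

STYLES = {
    "leaky": (leaky, {"decay": 0.5}),
    "integrate": (integrator, {}),
}

def step(neurons, inputs):
    """Advance every (style, state) neuron one time step via its style."""
    out = []
    for (style, v), inp in zip(neurons, inputs):
        fn, params = STYLES[style]
        out.append((style, fn(v, inp, params)))
    return out
```

Every neuron runs through the same machinery, but different parts of the network can reference different styles, which is the flexibility Davies describes.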

IC: One of the big things about Loihi v1 was that it was a single shiny chip which could act on its own, or in Pohoiki Springs there would be 768 chips all in a box. Can you give examples of what sort of workloads run on that single chip, versus the bigger systems? And does that change with Loihi 2?

MD: Fundamentally the kinds of workloads don't necessarily change - that's one of the interesting aspects of neuromorphic architecture. It's similar enough to the brain that, just as with more and more brain matter, the particular types of functions and features that are supported at these different scales don't change that much. For example, one workload we demonstrated is a similarity search function – such as over an image database. You might think of it as giving it an example image and querying to find the closest match; in the large system, we can scale up and support the largest possible database of images. But on a single chip, you can perform the same thing, just with a much smaller database. And so if you're deploying that in an edge device, or some kind of mobile drone or something, you may be very limited in a single-chip form factor in the range of different objects that can be detected. If you're doing something that's more data center oriented, you would have a much richer space of possibility there.
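The similarity search Davies mentions is, at its core, a nearest-neighbor query. A brute-force sketch for intuition only - the neuromorphic version computes this through parallel spiking dynamics rather than a sequential loop, and the function names here are invented:

```python
def nearest(query, database):
    """Toy similarity search: return the database entry (a feature
    vector, e.g. an image embedding) closest to the query vector."""
    def dist2(a, b):
        # squared Euclidean distance between two equal-length vectors
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(database, key=lambda item: dist2(query, item))
```

The scaling Davies describes is then just the size of `database`: a single chip holds a small one, a 768-chip system a much larger one, while the operation itself stays the same.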

But this is one area we've improved a lot – in Loihi v1, the bandwidth between the chips proved to be a bottleneck. So we did get congestion, despite this highly sparse style of communication. We're usually not transmitting, and we only transmit infrequently when there's information to be processed. But the bandwidth offered by the chip-to-chip links in Loihi was so much lower than what we have inside the chip that inevitably it started becoming a bottleneck in that 768-chip system for a lot of workloads. So we've boosted that in Loihi 2 by over 60 times, actually, if you consider all the different factors of the raw circuit speeds, and the compression features we've added now to reduce the need for the bandwidth and to reduce redundancy in that traffic. We've also added a third dimension, so that now we can scale not just planar networks, 2D meshes of chips, but we can actually scale with higher radix so that we can go into 3D.

IC: With Loihi 2, you're moving some connectivity to Ethernet. Does that simplify some aspects because there's already deep ecosystem based around Ethernet?

MD: The Ethernet is to address another limitation of a different kind that we see with neuromorphic technology. It's actually hard to integrate it into conventional architectures. In Loihi 1, we did a very purist asynchronous interconnect - one that allows us to scale up to these big system sizes and that natively carries asynchronous spikes from chip to chip. But of course at some point you want to interface this to conventional processors, with conventional data formats, and so that's the motivation to go and put a standard protocol in there that allows us to stream standard data formats. We have some accelerated spike encoding processes on the chip so that as we get real-world data streams we can now convert them in a more efficient, faster way. So Ethernet is more for integration into conventional systems.

IC: Spiking neural networks are all about instantaneous flashes of data or instructions through the synapses. Can you give us an indication what percentage of neurons and synapses are active at any one instant with a typical workflow? How should we think about that in relation to TDP?

MD: There is a dynamic range of power. Loihi, in a real world workload on a human timescale, would typically operate around 100 milliwatts. If you're computing something that's more abstract computationally, where you don't have to slow it down to human scales, say solving optimization problems, then it’s different. One demonstration we have is that with the German railway network we took an optimization workload and mapped it onto Loihi – for that you just want an answer as fast as possible, or maybe you have a batched up collection of problems to solve. In that case, the power can peak above one watt or so in a single Loihi chip. Loihi 2 will be similar, but we've put so many performance improvements into the design, and we’re reaching upwards of 10 times faster for some workloads. So we could operate Loihi 2 at a fairly high power level, but it’s not that much when we need it for real time/human timescale kind of workloads.

IC: In previous discussions about neuromorphic computing, one of the limitations isn't necessarily the compute from the neuromorphic processor, but finding sensors that can relay data in a spiking neural network format, such as video cameras. To what level is the Intel Neuromorphic team working on that front?

MD: So yes, there's a definite need, in some cases, to rethink sensing all the way to the sensors themselves. We've seen that new vision sensors, these emerging event cameras, are fantastic for directly producing spikes that speak the language of Loihi and other neuromorphic chips. We are certainly collaborating with some of the companies developing those sensors. There's also a big space of interesting possibility there for a really tight coupling between the neuromorphic chips and the sensors themselves.

Generally though, what matters more than just the format of the spikes is that the base for the data stream has to be a temporal one, rather than static snapshots. That's the problem with a conventional camera for neuromorphic interfacing: we need more of an evolving temporal signal. So audio waveforms, for example, are great for processing.

In that case, we can look at bio-inspired approaches. For audio, this is an example where, with the more generalized kind of neuron models in Loihi, we can model the cochlea (ear). In the cochlea, there is a biological structure that converts waveforms into spikes, making a spectral transform in which different spikes correspond to different frequencies. That's the kind of thing where, for the sensor part of it, we can still use a standard microphone, but we're going to change the way that we convert these fundamentally time-varying signal streams into discrete spike outputs.
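A cochlea-style front end can be imagined as a bank of resonators, one per frequency band, each emitting spikes when its band is active in the incoming waveform. A toy sketch - purely illustrative, not Intel's encoder; the function name, parameters, and thresholding scheme are all invented:

```python
import math

def cochlea_encode(signal, freqs, rate=1000, threshold=3.0, decay=0.99):
    """Toy cochlea-style encoder: one leaky resonator per frequency
    channel; a channel emits a spike (the sample index) whenever its
    accumulated energy crosses threshold, then resets."""
    spikes = {f: [] for f in freqs}
    state = {f: (0.0, 0.0) for f in freqs}   # (re, im) per channel
    for n, x in enumerate(signal):
        for f in freqs:
            w = 2 * math.pi * f / rate
            re, im = state[f]
            # rotate, damp, and drive the resonator with the input sample
            re, im = ((re * math.cos(w) - im * math.sin(w)) * decay + x,
                      (re * math.sin(w) + im * math.cos(w)) * decay)
            if re * re + im * im > threshold * threshold:
                spikes[f].append(n)   # spike: this band is active now
                re, im = 0.0, 0.0
            state[f] = (re, im)
    return spikes
```

Feeding this a 50 Hz sine makes the 50 Hz channel spike repeatedly while an off-frequency channel stays silent - the "spectral transform of spikes" Davies describes, with a standard microphone as the actual sensor.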

But yeah, sensors are a very important part of it. Tactile sensors are another example where we're collaborating with people producing these new types of tactile sensors, which you clearly want to be event-based. You don't want to read out all of the tactile sensors in a single synchronous time snapshot - you want to know when you've hit something and respond immediately. So here's another example where the bio-inspired approach to sensing tactile sensation is really good for a neuromorphic interface.

IC: So would it be fair to say that neuromorphic is perhaps best for interrupt based sensing, rather than polling based?

MD: In a very conventional computing mindset, absolutely! That's exactly it.
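That interrupt-style readout can be illustrated with a send-on-delta scheme, a common event-based encoding: transmit only when a reading moves meaningfully from the last transmitted value. This is a generic sketch, not any particular sensor's protocol; the function name and threshold are invented:

```python
def send_on_delta(samples, delta=0.2):
    """Toy event-based readout: instead of polling every sensor value,
    emit an event only when the reading moves more than `delta` from
    the last transmitted value."""
    events = []
    last = samples[0]
    for i, x in enumerate(samples[1:], start=1):
        if abs(x - last) > delta:
            events.append((i, x))   # (when it happened, new value)
            last = x
    return events
```

A slowly drifting tactile reading produces no traffic at all; the moment the sensor "hits something", an event fires immediately - exactly the interrupt-versus-polling contrast drawn above.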

IC: How close is Loihi 2 to a 'biological model'?

MD: I think our guiding approach is to understand the principles that come from the study of neuroscience, but not to copy feature by feature. So we've added a bit of programmability into our neuron models, for example - biology doesn't have programmable neurons. But the reason we've done that is so that we can support the diversity of neuron models that we find in the brain. It's no coincidence, and not just a quirk of evolution, that we have 1000s of different unique neuron types in the brain. It means that one size does not fit all. So we can try to design a chip that has 1000 different hard-coded circuits, each one trying to mimic exactly a particular neuron - or we can say we have one general type, but with programmability. Ultimately we need diversity, that's the lesson that comes from evolution, but let's give our chip the feature set that lets us cover a range of neuron models.

IC: Is that kind of like mixing an FPGA with your model?

MD: Yeah! Actually in many ways that is the closest parallel to a neuromorphic architecture.

IC: One of the applications of Loihi has been optimization problems - sudoku, train scheduling, puzzles. Could it also be applied to combative applications, such as chess or Go? How would the neuromorphic approach differ to the 'more traditional' machine learning?

MD: That’s a really interesting direction for research that we haven't gone deeply into yet. If you look at the best performing, adversarial type of reinforcement-based learning approaches that have proven so successful there, the key is to be able to run many, many, many different trials, vastly accelerated compared to what a human brain could process. The algorithm then learns from all of that. This is a domain where it starts being a little distant from what we're focused on in Neuromorphic, because we're often looking at human timescales, by and large, and processing data streams that are arriving in real time and adapting to them in the way that our brain adapts.

So if we're trying to learn in a superhuman way, such as all kinds of correlations in the game of Go that human brains struggle to achieve, I could see neuromorphic models being good for that. But we're going to have to go work on that acceleration aspect, and speed them up by vast amounts. But I think that's definitely a future direction - this is something that eventually we will get to, particularly deploying evolutionary approaches, where we can use vast parallelism similar to how nature evolves different networks in a kind of distributed adversarial game to evolve the best solution. We can absolutely apply those same techniques, neuromorphically, and that would be a guiding motivation to build really big neuromorphic systems in the future - not to achieve human brain scale, but to go well beyond it, to evolve the best performing agent.

IC: In normal computing, we have the concept of IPC - instructions per clock. What's the equivalent metric in Neuromorphic computing, and how does Loihi 2 compare to Loihi 1?

MD: That’s a great question, and it gets into some nuances of this field. There are metrics that we can look at: things like the number of synaptic operations that can be processed per unit of time, synaptic operations per second per watt, synaptic energy, neuron updates per time step or per unit of time, and the number of neurons that can be updated. In all of those metrics, we've generally improved Loihi 2 by at least a factor of two. As I was saying earlier, it's uniformly better by a big step over Loihi 1.

Now on the other hand, we tend to not really emphasize (at least in our research programme) those particular metrics, because once you start fixating on specific ops and trying to optimize for them, you're basically accepting that we know what the field wants and optimizing for those. But in the neuromorphic domain, there's just no clarity yet on exactly what is needed. For a deep learning accelerator, you want to crank out the greatest number of operations per second, right? But in the neuromorphic world, take something as simple as a synaptic operation: should that operation support a propagation delay, which is another parameter? Should it allow the weight that it applies to multiply with a strength that comes along with that spike event? Should the weight evolve in response? Should it change for learning purposes? These are all questions that we're looking at. So before we really fixate on a particular number, we want to figure out what the right operations are.

So as I say, we've certainly improved Loihi 2 over Loihi 1 by large measures. But I think energy is an example of one that we haven't aggressively optimized. Instead, we've chosen to augment with programmability and speed, because generally what we found with Loihi is that we got huge energy gains purely from the sparsity of the activity and the architectural aspects of the design. At this point, we don't need to take a 1000x improvement and make it 2000x: for this stage of development, 1000x is good enough if we can focus on other benefits. We want to balance the benefits a little bit more towards versatility.

IC: One of the announcements today is on software - you said in our briefing earlier today that there is no sort of universal collaborative framework for neuromorphic computing, and that everybody is kind of doing their own homespun things. Today Intel is introducing a new Lava framework, because traditional TensorFlow/PyTorch or that sort of machine learning doesn't necessarily translate to the neuromorphic world. How is Intel approaching industry collaboration for that standard? Also, will it become part of Intel's oneAPI?

MD: So there are components of Lava we might incorporate into oneAPI, but really Lava, the software framework that we're releasing, is the beginning of an open source project. It's not the release of some finished product that we're sharing with our partners - we've set up a basic architecture, and we've contributed some software assets that we've developed from the Loihi generation. But really, we see this as building on the learnings of the previous generation to try to provide a collaborative path forward and address the software challenges that still exist and are unsolved. Some of these are very deep research problems. But we need to get more people working together on a common codebase, because until we get that, progress is going to be slow. And that's somewhat inevitable - you have to have different groups building on other people's work, extending it, enhancing it, and polishing it to the point that non-specialists can come in and take some or all of these best methods - they may have no clue what magic neuroscience ideas have been optimized inside, but they get understandable libraries wrapped up to the point that they can be applied. So we're not at that stage yet, and it won't be an Intel product - it's going to be an open source Lava project that Intel contributes to.

IC: Speaking on the angle of getting people involved - I know Loihi 2 is an early announcement right now. But what scope is there for Loihi 2 to be on a USB stick, and get into the hands of non-traditional researchers for homebrew use cases?

MD: There's no plan at this point, but we're looking at possibilities for scaling out the availability of Loihi 2 beyond where we are with Loihi 1. But we're taking it step by step, because right now we're only unveiling the first cloud systems that people can start to access. We'll gauge the response and the interest in Lava, and how that lowers the barriers for entry to using the technology. One aspect of Lava that I didn't mention is that people can start using this on their CPU - so they can start developing models, and it will run incredibly slowly compared to what the neuromorphic chip can accelerate, but at least if we get more people using it and this nice dynamic of building and polishing the software occurs, then that will create a motivating case to go and make the hardware more widely available. I certainly hope we get to that point.

IC: If there's one main takeaway about neuromorphic computing that people should have after reading and listening to this interview, what should it be?

MD: The future is bright in this field. I'm really very excited by the results we had with that first generation, and Loihi 2 addresses very specific pain points which should just allow it to scale even better. We’ve seen some really impactful application demonstrations that were not possible with that first generation. So stay tuned – there are really fun times to come.

Many thanks to Mike Davies and his team for their time.




Technology - Latest - Google News
October 01, 2021 at 01:25AM
https://ift.tt/3zX1LD2

An Interview with Intel Lab's Mike Davies: The Next Generation of Neuromorphic Research - AnandTech

Ellen Pompeo Says She Cussed Out Denzel Washington On 'Grey's Anatomy' Set - HuffPost

Ellen Pompeo said she harshly snapped back at Denzel Washington when he directed an episode of “Grey’s Anatomy.”

Washington, a two-time Oscar-winning actor, was helming 2016′s “The Sound of Silence” episode when Pompeo said she ad-libbed the line, “Look at me when you apologize. Look at me,” in an exchange with another actor. The off-script moment irritated Washington. 

“Denzel went ham on my ass,” she told former co-star Patrick Dempsey on her “Tell Me” podcast, posted Wednesday. “He was like, ‘I’m the director. Don’t you tell him what to do.’ And I was like, ’Listen, motherfucker, this is my show. This is my set. Who are you telling? Like you barely know where the bathroom is.’”

The two didn’t speak for a while. Pompeo said she told Washington’s wife, Pauletta Pearson, that Washington yelled at her and “I’m not OK with him.”

But the two eventually patched things up, and Pompeo said she has the “utmost respect” for Washington. The passion of performers, she added, creates the “magic.”

“That’s where you get the good stuff,” she said. 

Another juicy bit of behind-the-scenes “G.A.” intrigue emerged recently. A producer said in a tell-all book about the series that Dempsey gave castmates “PTSD” and was “terrorizing the set” during his acrimonious departure from the show in 2015.




Entertainment - Latest - Google News
September 30, 2021 at 06:19PM
https://ift.tt/3F246QW

Ellen Pompeo Says She Cussed Out Denzel Washington On 'Grey's Anatomy' Set - HuffPost

Fairphone’s latest sustainable smartphone comes with a five-year warranty - The Verge

Fairphone, the manufacturer focused on easy-to-repair smartphones made of ethically sourced materials, just took the wraps off its fourth-generation handset. The Fairphone 4 uses a modular design that’s similar to the company’s previous phones, only now with more powerful internals, a five-year warranty, and a promise of two major Android updates and software support until the end of 2025. Prices start at €579 / £499 for the phone, which will ship on October 25th.

I’ve been using the Fairphone 4 for a couple of days as my primary phone, and while I’m not ready to give a final verdict just yet, it feels like a big step forward compared to the dated designs and low-power components found in the company’s previous phones. Stay tuned for my full review.

Fairphone’s ambition is to produce a more ethical alternative to modern smartphones. That means building a device from ethically and sustainably sourced materials, then providing the software support and warranty to keep it usable for as long as possible. Although Fairphone is only guaranteeing software support until the end of 2025, it has ambitions to extend this as far as 2027. In an ideal world, Fairphone would also eventually like to release 2024’s Android 15 as an update for the phone.

Normally, the specs of Fairphone’s devices are secondary to its ethical considerations, but unlike its predecessors, the Fairphone 4 is competitive with other midrange Android handsets. The 5G handset is powered by Qualcomm’s Snapdragon 750G processor, paired with either 6GB or 8GB of RAM and 128GB or 256GB of internal storage, expandable via microSD. It has a 3,905mAh removable battery, and the display is a 6.3-inch 1080p LCD panel.

There are two rear cameras — a 48-megapixel main camera and a 48-megapixel ultrawide — and a single 25-megapixel selfie camera. The main rear camera is equipped with optical image stabilization and can record at up to 4K / 30fps.

A notable downside compared to previous Fairphones is that the Fairphone 4 no longer includes a 3.5mm headphone jack, a choice that feels at odds with the company’s otherwise customer-first approach. Fairphone tells me it made this decision in order to be able to offer an official IP rating for dust and water resistance, which was missing from the company’s previous phones. It’s only IP54, which means it’s protected from light splashes rather than full submersion, but that’s impressive in light of its removable rear cover and modular design.

Regarding its modularity, Fairphone is selling eight repair modules for the phone: replacement displays, batteries, back covers, USB-C ports, loudspeakers, earpieces, rear cameras, and selfie cameras. All of these are easily removable using a standard Phillips-head screwdriver, which means customers should be able to carry out many repairs themselves. And if you need to turn to a professional, Fairphone says its spare parts are readily available for local repair shops to buy and use.

Fairphone’s previous two phones are the only devices to have received perfect repairability scores from iFixit, and the company tells me it believes the Fairphone 4 is even more repairable.

The hope is for these spare parts to be available until at least 2027. Fairphone has a good track record with previous devices, telling me it still has parts in stock for the six-year-old Fairphone 2, two years after the last handset was sold. But product manager Miquel Ballester concedes that the company has run out of certain parts for that model.

Fairphone also has a solid record on the software side of providing major Android updates for its phones… eventually. Earlier this year, the company officially released its Android 9 update for the Fairphone 2, a device that originally launched with Android 5. The update may have come almost three years after Android 9’s original release, but it means the phone continues to run an officially supported version of Google’s operating system. That bodes well for Fairphone’s support aspirations for the Fairphone 4, although the company will have to contend with the fact that Qualcomm only officially supports its chipsets for three major OS updates and four years of security updates, Ars Technica reports.

In terms of materials, the Fairphone 4 is made using Fairtrade-certified gold; responsibly sourced aluminum and tungsten; and recycled tin, rare earth minerals, and plastic (including its rear panel, which is 100 percent recycled polycarbonate). The company has various initiatives to improve the working conditions of miners and factory workers involved in the supply chains for its devices. Fairphone also claims that the Fairphone 4 is the “first electronic waste neutral handset” because it’ll recycle one phone or an equal amount of e-waste for each device sold.

The Fairphone 4 is available to preorder today in Europe and should ship starting October 25th. The model with 6GB of RAM and 128GB of storage costs €579 / £499, while the step-up model with 8GB of RAM and 256GB of storage retails for €649 / £569. Unfortunately, there’s no sign of a US release: Fairphone says it’s interested but that it’s focusing on Europe for the time being.




Technology - Latest - Google News
September 30, 2021 at 05:30PM
https://ift.tt/3mfA17I

Fairphone’s latest sustainable smartphone comes with a five-year warranty - The Verge
