The enterprise verdict on AI models: Why open source will win



The enterprise world is rapidly growing its usage of open source large language models (LLMs), driven by companies gaining more sophistication around AI – seeking greater control, customization, and cost efficiency. 

While closed models like OpenAI’s GPT-4 dominated early adoption, open source models have since closed the gap in quality, and are growing at least as quickly in the enterprise, according to multiple VentureBeat interviews with enterprise leaders.

This is a change from earlier this year, when I reported that while the promise of open source was undeniable, it was seeing relatively slow adoption. But Meta’s openly available models have now been downloaded more than 400 million times, the company told VentureBeat, at a rate 10 times higher than last year, with usage doubling from May through July 2024. This surge in adoption reflects a convergence of factors – from technical parity to trust considerations – that are pushing advanced enterprises toward open alternatives.


“Open always wins,” declares Jonathan Ross, CEO of Groq, a provider of specialized AI processing infrastructure that has seen massive uptake of customers using open models. “And most people are really worried about vendor lock-in.”

Even AWS, which made a $4 billion investment in closed-source provider Anthropic – its largest investment ever – acknowledges the momentum. “We are definitely seeing increased traction over the last number of months on publicly available models,” says Baskar Sridharan, AWS’ VP of AI & Infrastructure, which offers access to as many models as possible, both open and closed source, via its Bedrock service. 

The platform shift by big app companies accelerates adoption

It’s true that among startups and individual developers, closed-source models like OpenAI’s still lead. But in the enterprise, things look very different. Unfortunately, there is no third-party source that tracks the open versus closed LLM race for the enterprise, in part because it’s near impossible to do: the enterprise world is too distributed, and companies are too private, for this information to be public. API company Kong surveyed more than 700 users in July, but the respondents included smaller companies as well as enterprises, so the results skewed toward OpenAI, which without question still leads among startups looking for simple options. (The report also included other AI services like Bedrock, which is not an LLM but a service that offers multiple LLMs, including open source ones — so it mixes apples and oranges.)

Image from a report by the API company Kong. Its July survey shows ChatGPT still winning, with open models Mistral, Llama, and Cohere still behind.

But anecdotally, the evidence is piling up. For one, each of the major business application providers has moved aggressively recently to integrate open source LLMs, fundamentally changing how enterprises can deploy these models. Salesforce led the latest wave by introducing Agentforce last month, recognizing that its customer relationship management customers needed more flexible AI options. The platform enables companies to plug in any LLM within Salesforce applications, effectively making open source models as easy to use as closed ones. Salesforce-owned Slack quickly followed suit.

Oracle also last month expanded support for the latest Llama models across its enterprise suite, which includes the big enterprise apps of ERP, human resources, and supply chain. SAP, another business app giant, announced comprehensive open source LLM support through its Joule AI copilot, while ServiceNow enabled both open and closed LLM integration for workflow automation in areas like customer service and IT support.


“I think open models will ultimately win out,” says Oracle’s EVP of AI and Data Management Services, Greg Pavlik. The ability to modify models and experiment, especially in vertical domains, combined with favorable cost, is proving compelling for enterprise customers, he said.

A complex landscape of “open” models

While Meta’s Llama has emerged as a frontrunner, the open LLM ecosystem has evolved into a nuanced marketplace with different approaches to openness. For one, Meta’s Llama has more than 65,000 model derivatives in the market. Enterprise IT leaders must navigate these and other options, ranging from fully open weights and training data to hybrid models with commercial licensing.

Mistral AI, for example, has gained significant traction by offering high-performing models with flexible licensing terms that appeal to enterprises needing different levels of support and customization. Cohere has taken another approach, providing open model weights but requiring a license fee – a model that some enterprises prefer for its balance of transparency and commercial support.

This complexity in the open model landscape has become an advantage for sophisticated enterprises. Companies can choose models that match their specific requirements – whether that’s full control over model weights for heavy customization, or a supported open-weight model for faster deployment. The ability to inspect and modify these models provides a level of control impossible with fully closed alternatives, leaders say. Using open source models also often requires a more technically proficient team to fine-tune and manage the models effectively, another reason enterprise companies with more resources have an upper hand when using open source.


Meta’s rapid development of Llama exemplifies why enterprises are embracing the flexibility of open models. AT&T uses Llama-based models for customer service automation, DoorDash for helping answer questions from its software engineers, and Spotify for content recommendations. Goldman Sachs has deployed these models in heavily regulated financial services applications. Other Llama users include Niantic, Nomura, Shopify, Zoom, Accenture, Infosys, KPMG, Wells Fargo, IBM, and The Grammy Awards. 

Meta has aggressively nurtured channel partners. All major cloud providers embrace Llama models now. “The amount of interest and deployments they’re starting to see for Llama with their enterprise customers has been skyrocketing,” reports Ragavan Srinivasan, VP of Product at Meta, “especially after Llama 3.1 and 3.2 have come out. The large 405B model in particular is seeing a lot of really strong traction because very sophisticated, mature enterprise customers see the value of being able to switch between multiple models.” He said customers can use a distillation service to create derivative models from Llama 405B, fine-tuning them based on their own data. Distillation is the process of creating smaller, faster models while retaining core capabilities.
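For readers curious what distillation looks like mechanically, here is a toy sketch of the general technique (not Meta’s actual distillation service; the numbers and function names are invented for illustration). The teacher model’s output distribution, softened by a temperature, becomes the training target for the student, and the loss measures the gap between the two:

```python
import math

def softmax(logits, temperature=1.0):
    """Turn raw model scores (logits) into probabilities; a higher
    temperature softens the distribution, exposing more of the
    teacher's knowledge about near-miss answers."""
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def distillation_loss(teacher_logits, student_logits, temperature=2.0):
    """KL divergence between softened teacher and student outputs --
    the core training signal when distilling a large model into a small one."""
    p = softmax(teacher_logits, temperature)
    q = softmax(student_logits, temperature)
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

# A student that matches the teacher incurs zero loss; one that
# disagrees is penalized and nudged toward the teacher's behavior.
teacher = [2.0, 1.0, 0.1]
matched = distillation_loss(teacher, [2.0, 1.0, 0.1])
mismatched = distillation_loss(teacher, [0.1, 1.0, 2.0])
```

In a real training loop this loss would be minimized over the student’s weights across millions of examples; the sketch only shows the signal being minimized.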

Indeed, Meta covers the landscape well with the rest of its model portfolio, including the Llama 90B model, which can be used as a workhorse for a majority of prompts, and the 1B and 3B models, which are small enough to be used on device. Today, Meta released “quantized” versions of those smaller models. Quantization is another process that makes a model smaller, allowing lower power consumption and faster processing. What makes these latest versions special is that they were quantized during training, making them more efficient than the industry’s typical after-the-fact quantized knock-offs: four times faster at token generation than their originals, using a fourth of the power.
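The core idea of quantization is simple to sketch. The minimal post-training variant below maps each floating-point weight to an 8-bit integer plus one shared scale factor; Meta’s new models, by contrast, were quantized during training, which is what preserves their quality. The weights here are invented for illustration:

```python
def quantize_int8(weights):
    """Symmetric quantization: map each float weight to an integer in
    [-127, 127] plus a single shared scale factor."""
    scale = max(abs(w) for w in weights) / 127.0
    return [round(w / scale) for w in weights], scale

def dequantize(q, scale):
    """Recover approximate float weights from the int8 values."""
    return [qi * scale for qi in q]

weights = [0.8, -1.27, 0.003, 0.5]
q, scale = quantize_int8(weights)
restored = dequantize(q, scale)

# Each value now fits in one byte instead of four, at the cost of a
# rounding error no larger than half the scale factor per weight.
max_error = max(abs(w - r) for w, r in zip(weights, restored))
```

The four-times-smaller representation is where the power and speed savings come from; the rounding error is what quantization-aware training learns to compensate for.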

Technical capabilities drive sophisticated deployments

The technical gap between open and closed models has essentially disappeared, but each shows distinct strengths that sophisticated enterprises are learning to leverage strategically. This has led to a more nuanced deployment approach, where companies combine different models based on specific task requirements.


“The large, proprietary models are phenomenal at advanced reasoning and breaking down ambiguous tasks,” explains Salesforce EVP of AI, Jayesh Govindarajan. But for tasks that are light on reasoning and heavy on crafting language, for example, drafting emails, creating campaign content, and researching companies, “open source models are at par and some are better,” he said. Moreover, even the high-reasoning tasks can be broken into sub-tasks, many of which end up becoming language tasks where open source excels, he said.

Intuit, the owner of accounting software QuickBooks and tax software TurboTax, got started on its LLM journey a few years ago, making it a very early mover among Fortune 500 companies. Its implementation demonstrates a sophisticated approach. For customer-facing applications like transaction categorization in QuickBooks, the company found that its fine-tuned LLM built on Llama 3 demonstrated higher accuracy than closed alternatives. “What we find is that we can take some of these open source models and then actually trim them down and use them for domain-specific needs,” explains Ashok Srivastava, Intuit’s chief data officer. They “can be much smaller in size, much lower in latency and equal, if not greater, in accuracy.”

The banking sector illustrates the migration from closed to open LLMs. ANZ Bank, which serves Australia and New Zealand, started out using OpenAI for rapid experimentation. But when it moved to deploy real applications, it dropped OpenAI in favor of fine-tuning its own Llama-based models to accommodate its specific financial use cases, driven by needs for stability and data sovereignty. The bank published a blog about the experience, citing the flexibility provided by Llama’s multiple versions, flexible hosting, version control, and easier rollbacks. We know of another top-three U.S. bank that also recently moved away from OpenAI.

It’s examples like this, where companies want to leave OpenAI for open source, that have given rise to things like “switch kits” from companies like PostgresML that make it easy to exit OpenAI and embrace open source “in minutes.”


Infrastructure evolution removes deployment barriers

The path to deploying open source LLMs has been dramatically simplified. Meta’s Srinivasan outlines three key pathways that have emerged for enterprise adoption:

  1. Cloud Partner Integration: Major cloud providers now offer streamlined deployment of open source models, with built-in security and scaling features.
  2. Custom Stack Development: Companies with technical expertise can build their own infrastructure, either on-premises or in the cloud, maintaining complete control over their AI stack – and Meta is helping with its so-called Llama Stack.
  3. API Access: For companies seeking simplicity, multiple providers now offer API access to open source models, making them as easy to use as closed alternatives. Groq, Fireworks, and Hugging Face are examples. All of them provide an inference API, a fine-tuning API, and essentially anything else you would need or would get from a proprietary provider.

Safety and control advantages emerge

The open source approach has also – unexpectedly – emerged as a leader in model safety and control, particularly for enterprises requiring strict oversight of their AI systems. “Meta has been incredibly careful on the safety part, because they’re making it public,” notes Groq’s Ross. “They actually are being much more careful about it. Whereas with the others, you don’t really see what’s going on and you’re not able to test it as easily.”

This emphasis on safety is reflected in Meta’s organizational structure. Its team focused on Llama’s safety and compliance is large relative to its engineering team, Ross said, citing conversations with Meta a few months ago. (A Meta spokeswoman said the company does not comment on personnel information.) The September release of Llama 3.2 introduced Llama Guard Vision, adding to safety tools released in July. These tools can:

  • Detect potentially problematic text and image inputs before they reach the model
  • Monitor and filter output responses for safety and compliance

Enterprise AI providers have built upon these foundational safety features. AWS’s Bedrock service, for example, allows companies to establish consistent safety guardrails across different models. “Once customers set those policies, they can choose to move from one publicly available model to another without actually having to rewrite the application,” explains AWS’ Sridharan. This standardization is crucial for enterprises managing multiple AI applications.

Databricks and Snowflake, the leading cloud data providers for the enterprise, also vouch for Llama’s safety. Llama models maintain “the highest standards of security and reliability,” said Hanlin Tang, Databricks’ CTO of Neural Networks.

Intuit’s implementation shows how enterprises can layer additional safety measures. The company’s GenSRF (security, risk and fraud assessment) system, part of its “GenOS” operating system, monitors about 100 dimensions of trust and safety. “We have a committee that reviews LLMs and makes sure its standards are consistent with the company’s principles,” Intuit’s Srivastava explains. However, he said these reviews of open models are no different than the ones the company makes for closed-sourced models.


Data provenance solved through synthetic training

A key concern around LLMs is the data they’ve been trained on. Lawsuits abound from publishers and other creators, charging LLM companies with copyright violation. Most LLM companies, open and closed, haven’t been fully transparent about where they get their data. Since much of it comes from the open web, it can be highly biased and contain personal information.

Many closed-source companies have offered users “indemnification,” or protection against legal claims or lawsuits arising from use of their LLMs. Open source providers usually do not provide such indemnification. But lately this concern around data provenance seems to have declined somewhat. Models can be grounded and filtered with fine-tuning, and Meta and others have created more alignment and other safety measures to counteract the concern. Data provenance is still an issue for some enterprise companies, especially those in highly regulated industries, such as banking or healthcare. But some experts suggest these concerns may be resolved soon through synthetic training data.

“Imagine I could take public, proprietary data and modify them in some algorithmic ways to create synthetic data that represents the real world,” explains Salesforce’s Govindarajan. “Then I don’t really need access to all that sort of internet data… The data provenance issue just sort of disappears.”

Meta has embraced this trend, incorporating synthetic data training in Llama 3.2’s 1B and 3B models.


Regional patterns may reveal cost-driven adoption

The adoption of open source LLMs shows distinct regional and industry-specific patterns. “In North America, the closed source models are certainly getting more production use than the open source models,” observes Oracle’s Pavlik. “On the other hand, in Latin America, we’re seeing a big uptick in the Llama models for production scenarios. It’s almost inverted.”

What is driving these regional variations isn’t clear, but they may reflect different priorities around cost and infrastructure. Pavlik describes a scenario playing out globally: “Some enterprise user goes out, they start doing some prototypes…using GPT-4. They get their first bill, and they’re like, ‘Oh my god.’ It’s a lot more expensive than they expected. And then they start looking for alternatives.”

Market dynamics point toward commoditization

The economics of LLM deployment are shifting dramatically in favor of open models. “The price per token of generated LLM output has dropped 100x in the last year,” notes venture capitalist Marc Andreessen, who questioned whether profits might be elusive for closed-source model providers. This potential “race to the bottom” creates particular pressure on companies that have raised billions for closed-model development, while favoring organizations that can sustain open source development through their core businesses.

“We know that the cost of these models is going to go to zero,” says Intuit’s Srivastava, warning that companies “over-capitalizing in these models could soon suffer the consequences.” This dynamic particularly benefits Meta, which can offer free models while gaining value from their application across its platforms and products.


A good analogy for the LLM competition, Groq’s Ross says, is the operating system wars. “Linux is probably the best analogy that you can use for LLMs.” While Windows dominated consumer computing, it was open source Linux that came to dominate enterprise systems and industrial computing. Intuit’s Srivastava sees the same pattern: “We have seen time and again: open source operating systems versus non open source. We see what happened in the browser wars,” when open source Chromium-based browsers beat their closed rivals.

Walter Sun, SAP’s global head of AI, agrees: “I think that in a tie, people can leverage open source large language models just as well as the closed source ones, that gives people more flexibility.” He continues: “If you have a specific need, a specific use case… the best way to do it would be with open source.”

Some observers, like Groq’s Ross, believe Meta may be in a position to commit $100 billion to training its Llama models, which would exceed the combined commitments of proprietary model providers, he said. Meta has an incentive to do this, he said, because it is one of the biggest beneficiaries of LLMs: it needs them to improve intelligence in its core business, by serving up AI to users on Instagram, Facebook, and WhatsApp. Meta says its AI touches 185 million weekly active users, a scale matched by few others.

This suggests that open source LLMs won’t face the sustainability challenges that have plagued other open source initiatives. “Starting next year, we expect future Llama models to become the most advanced in the industry,” declared Meta CEO Mark Zuckerberg in his July letter of support for open source AI. “But even before that, Llama is already leading on openness, modifiability, and cost efficiency.”


Specialized models enrich the ecosystem

The open source LLM ecosystem is being further strengthened by the emergence of specialized industry solutions. IBM, for instance, has released its Granite models as fully open source, specifically trained for financial and legal applications. “The Granite models are our killer apps,” says Matt Candy, IBM’s global managing partner for generative AI. “These are the only models where there’s full explainability of the data sets that have gone into training and tuning. If you’re in a regulated industry, and are going to be putting your enterprise data together with that model, you want to be pretty sure what’s in there.”

IBM’s business benefits from open source, including from wrapping its Red Hat Enterprise Linux operating system into a hybrid cloud platform that includes usage of the Granite models and its InstructLab, a way to fine-tune and enhance LLMs. The AI business is already kicking in. “Take a look at the ticker price,” says Candy. “All-time high.”

Trust increasingly favors open source

Trust is shifting toward models that enterprises can own and control. Ted Shelton, COO of Inflection AI, a company that offers enterprises access to licensed source code and full application stacks as an alternative to both closed and open source models, explains the fundamental challenge with closed models: “Whether it’s OpenAI, it’s Anthropic, it’s Gemini, it’s Microsoft, they are willing to provide a so-called private compute environment for their enterprise customers. However, that compute environment is still managed by employees of the model provider, and the customer does not have access to the model.” This is because the LLM owners want to protect proprietary elements like source code, model weights, and hyperparameter training details, which can’t be hidden from customers who would have direct access to the models. Since much of this code is written in Python, not a compiled language, it remains exposed.

This creates an untenable situation for enterprises serious about AI deployment. “As soon as you say ‘Okay, well, OpenAI’s employees are going to actually control and manage the model, and they have access to all the company’s data,’ it becomes a vector for data leakage,” Shelton notes. “Companies that are actually really concerned about data security are like ‘No, we’re not doing that. We’re going to actually run our own model. And the only option available is open source.’”


The path forward

While closed-source models maintain a market share lead for simpler use cases, sophisticated enterprises increasingly recognize that their future competitiveness depends on having more control over their AI infrastructure. As Salesforce’s Govindarajan observes: “Once you start to see value, and you start to scale that out to all your users, all your customers, then you start to ask some interesting questions. Are there efficiencies to be had? Are there cost efficiencies to be had? Are there speed efficiencies to be had?”

The answers to these questions are pushing enterprises toward open models, even if the transition isn’t always straightforward. “I do think that there are a whole bunch of companies that are going to work really hard to try to make open source work,” says Inflection AI’s Shelton, “because they got nothing else. You either give in and say a couple of large tech companies own generative AI, or you take the lifeline that Mark Zuckerberg threw you. And you’re like: ‘Okay, let’s run with this.’”




Samsung scientists are working on a new type of memory that could bring RAM-like speeds and SSD capacities together


Samsung has used advanced computer modeling to accelerate the development of Selector-Only Memory (SOM), a new memory technology that combines non-volatility with DRAM-like read/write speeds and stackability.

Building on the company’s earlier research in the field, SOM is based on cross-point memory architectures, similar to phase-change memory and resistive RAM (RRAM), where stacked arrays of electrodes are used. Typically, these architectures require a selector transistor or diode to address specific memory cells and prevent unintended electrical pathways.



A major contributor to India’s growth story (The Week)


The online gaming industry in India has been on a transformational journey with a promising growth trajectory, despite regulatory ambiguities and a high tax rate. Not only has it been a major contributor to the broader media & entertainment space but has also become an integral component of the Animation, Visual Effects, Gaming & Comics (AVGC) sector in India, drawing significant government support, with many states crafting their own AVGC policies to give a boost to the sector.

With over 1,400 homegrown online gaming companies, India’s online gaming sector is uniquely positioned to support the country’s economic growth ambitions, attract large foreign investments, generate sizeable employment, and spur innovation. Consequently, it has the potential and the necessary elements to become a global supplier and to establish itself as a source of India’s soft power on the global front.

India hosts the second-largest community of gamers globally and has become a popular choice in the entertainment sector, which clearly reflects a transformative shift in entertainment consumption patterns. That said, the online gaming industry goes beyond entertainment; it opens up opportunities in various allied industries such as UI/UX design, data engineering, development, programming, testing, sales, and branding & marketing. It also fuels innovation in emerging sectors like AI/ML, cybersecurity, cloud, and fintech.

Currently, this sector alone provides over 1 lakh jobs to the skilled workforce of the country and is expected to add 1.5 lakh more by 2025. Further, with only 31% of the rural population using the internet compared to 67% of urban residents, as per the India Inequality Report 2022, there is a significant economic opportunity to increase internet access and digital inclusion in rural areas.


A recent report by the EGROW foundation and Primus Partners states that there has been a 20-fold increase in the workforce between 2018-2023, with a 97.56% compounding annual growth rate. In terms of workforce participation, the industry has significant male participation and was mostly viewed as a male-dominant industry for the longest time. But what truly stands out is the evolving gender dynamics in this space, with female workforce participation far outpacing male workforce participation, achieving a massive 103.15% CAGR in the same period.

Moreover, there has been a steady increase in female participation in gaming. As of 2022, about 43% of women engaged in online gaming daily, with most female participation from non-metro cities. Furthermore, the sector not only recognizes the contribution that women bring to the creative and business processes but also fosters a more inclusive environment for them to thrive. This is evident in the surge in female gaming content creators and streamers in the country.

The findings of the report also highlight the sector’s contribution to the AVGC industry, which is projected to rise 68% by 2026. The government has provided much-needed impetus to the industry by charting out a forward-looking path, constituting a task force, and setting up the first National Centre of Excellence solely dedicated to the AVGC industry.

However, for the industry to thrive and enter the next phase of growth, the government must come out with a national AVGC policy that has been in the works for some time.


Despite the tremendous growth seen by the industry, certain concerns pertaining to excessive screen time, addiction and financial fraud remain. This becomes even more critical in the context of teenagers and young adults who need to be made aware of responsible gaming practices. In this regard, the recent installation of ‘Beware of Smartphone Zombies’ signboards in Bengaluru is a stark reminder of the growing epidemic of digital distraction. While some of these concerns are being addressed by the industry, more can be done to safeguard vulnerable consumers. For instance, to limit exposure to screen time and mitigate financial risk for the consumer, features such as time limits, monetary limitations and exclusions have been introduced by several gaming platforms with the aid of technology.

Further, online gaming platforms often require personal information such as name, age, and contact details, and, in the case of real-money gaming, financial information as well. With this comes the risk of data breaches and related concerns such as identity theft leading to financial fraud. In this context, Know Your Customer (KYC) procedures play a critical role in helping protect both consumers and businesses from fraud. Online gaming intermediaries are also required to process and store digital personal and non-personal data in compliance with the applicable data protection laws of India. However, until the Digital Personal Data Protection Act comes into force, this remains a voluntary effort.

Last but not least, a sector that holds substantial economic promise deserves regulatory backing and clarity. For much of its existence, the industry has operated self-sufficiently, with the collective efforts of the industry leading the way toward a more robust, responsible, and accountable ecosystem. However, regulatory ambiguities and uncertainties have time and again created roadblocks for Indian gaming startups, and therefore it is necessary that regulatory clarity be provided and, as a first step, that the amended IT rules be implemented.

With an encouraging regulatory environment, the online gaming industry, which has seen a 27.45% CAGR between 2019 and 2022 in its contribution to the country’s GDP, can further enhance India’s growth story and solidify its position as a disruptor in the global gaming landscape.


(The author is a Member of Parliament, Rajya Sabha, and former Minister of State for GAD, Education, Health, Maharashtra).

The opinions expressed in this article are those of the author and do not purport to reflect the opinions or views of THE WEEK. 



Venom, Joker, and the year of supervillain cinema

Venom: The Last Dance (Sony Pictures)

Mark Millar’s limited series Wanted, loosely adapted in 2008 into an atrocious movie, imagined a dystopian world where all the superheroes are dead and the supervillains have won. That’s kind of how the multiplex feels right now. Comic-book cinema, which towered over the competition a mere five years ago (it reached its popular peak in 2019, the year of Avengers: Endgame and Joker), has entered a state of ongoing commercial decline. Capes and cowls are no longer a sure thing at the box office; increasingly, it feels like we’ve stepped into a post-superhero age. And in the absence of the virtuously costumed, it’s supervillains — and antiheroes — who have fought for dominance over the screens of 2024.

This weekend, for example, marks the theatrical return of Venom, the erstwhile Spider-Man arch-nemesis, again divorced from any relationship to Marvel’s friendly neighborhood web-slinger. Venom: The Last Dance, which just opened in theaters everywhere, rounds out a whole trilogy of starring vehicles for Tom Hardy’s take on hapless journalist Eddie Brock and the trash-talking, long-tongued extraterrestrial who’s made a home inside his bulky body.

Joker: Folie à Deux (Warner Bros.)

Need another fix of bad? The Last Dance arrives on the heels of Joker: Folie à Deux, the majorly underperforming musical sequel to Todd Phillips’ origin story for the most infamous madman from Batman’s gallery of rogues, the Clown Prince of Crime. And it anticipates another Sony spotlight for a Spidey foe, Kraven the Hunter, which is due this Christmas and belongs to the same weird, misbegotten franchise of Spider-Man movies without Spider-Man as the Venom series and this past spring’s baffling bit-player flop Madame Web. Hell, even the one bona fide comic-book-movie hit of the year, Deadpool & Wolverine, stars a character who began his fictional life as a villain, a quipping adversary of various X teams.

Not so long ago, any of these characters getting their very own movie would have been inconceivable. The mere existence of Kraven the Hunter is proof of how deeply Hollywood bought into the lie that anything Marvel- or DC-related could be a giant hit. Starring vehicles for supervillains feels like the natural next step (or maybe the last step, the point of termination) for a cash-cow genre that’s looked to back issues and more obscure corners of comicdom for available source material. You don’t get this year’s crop of bad-guy spectacles without the previous decade’s experiments in making second stringers into A-listers. There would likely be no Venom trilogy without the success of Guardians of the Galaxy or Suicide Squad.

[Image: Venom in The Amazing Spider-Man #300. Credit: Marvel Comics.]

To some extent, superhero cinema has worked back around to the ’90s, when the genre was basically Batman sequels and adaptations of cult comics like The Crow and Tank Girl and Judge Dredd. That was also the era when the big two publishers were lining up their own starring vehicles for the heavies of their respective universes. Again, Venom and Deadpool were both villains before they proved popular enough to get the antihero makeover, and to headline their own limited and ongoing series. In truth, this was always kind of a letdown. Venom, that slobbering rage monster, made for a pretty scary Spidey rival. Softening him into an “edgy” vigilante, a so-called “lethal protector,” was a waste of a good adversary.

[Image: Tom Hardy as Eddie Brock with the Venom symbiote. Credit: Sony Pictures.]

This year’s unlikely supervillain movies suffer from a similar problem. They soften and brighten characters whose whole appeal was their rough edges and their darkness. The Venom movies are not without their pleasures, most of them courtesy of Hardy’s valiant effort to forge a screwball buddy comedy out of the symbiotic relationship between Eddie and his alien guest. But Venom has always been cooler as a villain, a vengeful anti-Spider-Man, and the movies never approach the fearsomeness that made him such a popular character in the first place. Imagine flashing back to 1988 and telling a reader that not only would Venom one day get his own trilogy of movies but that he’d be reduced to a one-man Midnight Run, a glorified mismatched-partner routine.

Likewise, Joker: Folie à Deux buys so fully into the idea that Joaquin Phoenix’s Arthur Fleck is a misunderstood misfit — destined for infamy only because he was abandoned by the system — that it leaches the character of all his psychotic power. You don’t have to be an incensed fanboy to recognize that turning the Joker into a pitiable sadsack is a deflating approach to one of the most flavorfully outsized villains in all of comics. And if Deadpool has been a superhero for a lot longer than he was a supervillain, it’s still odd to see his trilogy of movies undercut their anarchic, sarcastic spirit with warm-and-fuzzies. Who was clamoring for a Deadpool with big feels? Are we really supposed to care about the crime-fighting dreams of a psychotic assassin who breaks the fourth wall at every opportunity?

[Image: Two men stand close to each other in Deadpool & Wolverine. Credit: Marvel Studios.]

The Venom and Joker films — along with Suicide Squad and Morbius and, one must presume, the forthcoming Kraven the Hunter — run into the same daunting obstacle, which is that it’s hard to build a conventional movie around characters who work best in opposition to the superhero, as a distorting mirror or foil or hurdle. All of them get around that problem by essentially turning their villains into more virtuous, upstanding, or even conflicted versions of themselves… which ends up violating what’s special about them. It’s actually hard to imagine a Venom or Joker movie that embraced the more twisted (or #twisted) aspects of either, because where would the rooting interest lie? You’d have something like The Fly or Natural Born Killers — which, no, that sounds pretty good, actually. What we got instead were de facto superhero movies in supervillain drag.

[Image: A man looks ahead in Kraven the Hunter. Credit: Sony.]

These films evoke the grimdark ’90s in another way, one that should be much less comforting for studio executives. That decade wasn’t just the era when comics were locked in an arms race of excessive edginess, with both Marvel and DC — along with Image, a publisher that was edginess all the time — pushing superheroes into the ethically cloudy arena of antiheroism. It was also a time of boom and bust for the comics industry, when an explosion of big sales and collector investment earlier in the decade led to a rapid decline in interest, culminating in Marvel filing for Chapter 11 bankruptcy at the end of 1996. Maybe superhero cinema is following a similar trajectory, sputtering out with a run of stories for the tortured bad boys of their roster. At the end of the parade, the rapscallions briefly take the spotlight.

[Image: Superman and his dog look at Earth from space. Credit: James Gunn / X.]

But in the words of one of the genre’s biggest and best hits, maybe the night is darkest before the dawn. Which is to say, maybe there’s a glimmer of something brighter on the horizon, past these (mostly unsuccessful) flirtations with the dark side of the superhero industrial complex. The bad guys had their moment this year. Don’t be surprised if the medium’s most iconic character, a man who puts the super in superhero, kicks off a comeback for the good guys next year.

Venom: The Last Dance is now playing in theaters everywhere. Joker: Folie à Deux is playing in a dwindling number of theaters everywhere. For more of A.A. Dowd’s writing, visit his Authory page.


Technology

UnitedHealth admits hack exposed data of 100 million Americans


UnitedHealth has admitted that the health data of more than 100 million Americans was exposed in a hack. This is the first time the multinational health insurance and services company has attributed a specific number to the cyberattack that took place earlier this year.

UnitedHealth admits health data of 100 million US citizens was compromised

UnitedHealth Group (UHG) acquired Change Healthcare in 2022. The two companies are now part of the same healthcare organization under the UnitedHealth brand.

In February this year, Change Healthcare suffered a massive data breach. However, the company did not mention the number of individuals whose data was exposed.

In May, UnitedHealth CEO Andrew Witty indicated that “maybe a third” of all Americans’ health data was exposed in the attack. A month later, Change Healthcare published a data breach notification, wherein the company merely stated that the ransomware attack exposed a “substantial quantity of data” for a “substantial proportion of people in America.”


The U.S. Department of Health and Human Services Office for Civil Rights (OCR) has updated its “Data Breach” portal. The entry for the Change Healthcare hack reportedly lists 100 million affected individuals.

Largest American healthcare data breach in recent years

The FAQ section on the OCR website now states: “On October 22, 2024, Change Healthcare notified OCR that approximately 100 million individual notices have been sent regarding this breach.”

Needless to say, with 100 million American citizens impacted, the ransomware attack could be one of the largest in recent years. What’s even more concerning, apart from the sheer number of people affected, is how the data breach was handled.

According to Bleeping Computer, threat actors stole 6TB of data from Change Healthcare. The attackers then encrypted computers on the network. As a remedial measure, the UnitedHealth subsidiary shut down its IT systems. This led to widespread outages in the U.S. healthcare system.


The BlackCat ransomware group, which conducted the attack, may have received about $22 million from UnitedHealth Group. The company allegedly paid to receive a decryption key and ensure the ransomware group deleted the stolen data.

The affiliate that worked with the ransomware group didn’t delete the data immediately. Since then, however, the entry for Change Healthcare has mysteriously disappeared from the affiliate’s website, which suggests UnitedHealth may have paid a second ransom demand.

It is not clear how UnitedHealth will be penalized. For comparison, T-Mobile recently paid a relatively modest $31.5 million penalty for multiple data breaches, half of which the carrier will invest in technology to improve its cybersecurity.


Technology

Google’s Pixel Tablet is up to $110 off right now


Update 10/26/24 9am ET: The deal below has expired, but you can get a similar deal on the Pixel Tablet at Wellbots right now. The Pixel Tablet with its charging speaker dock is $110 off and down to $489 when you use the code ENGPIX110 at checkout. You can use the same code to get $110 off the 256GB Pixel Tablet on its own, bringing the final price down to $389.


Tablets might be a cheaper alternative to laptops, but they can still cost a good chunk of money. Sales make all the difference and, right now, the 128GB Google Pixel Tablet is available for $275, down from $399. The 31 percent discount brings this tablet to a new all-time low price. The sale only applies to the Porcelain model and doesn’t include the speaker dock (though that combo is 11 percent off).

Google released the Pixel Tablet in summer 2023, and it offered a mix of features we really liked and others that left us underwhelmed. We gave it an 84 in our review thanks, in large part, to its smart home features. Our reviewer, Cherlynn Low, already had a Nest Mini in her room, but was impressed with how much better the tablet worked. The sound is great — though that was thanks to the Speaker Dock — and its Hub Mode is very useful. It shows you all the devices throughout your home, including camera feeds, and lets you do things like switch lamps on and off.

If you want this device for entertainment and ease of use, it could be great. However, there were a few aspects we were less keen on: some of the gestures aren’t very intuitive, and we didn’t use it much without the stand. Still, battery life is strong when undocked, lasting 21 and a half hours at 50 percent brightness in our test.

Follow @EngadgetDeals on Twitter and subscribe to the Engadget Deals newsletter for the latest tech deals and buying advice.


Technology

Velan Studios readies launch for Bounce Arcade VR game for Meta Quest



Velan Studios announced its VR pinball game Bounce Arcade is available for pre-order on Meta Quest and will debut on November 21.

In the game, you enter the machine and play pinball like you’ve never seen it before, across multiple themed pinball tables full of challenges. Players must survive, unlock mini-games, and chase high scores in fast-paced action gameplay.

Velan Studios, started by brothers Guha and Karthik Bala in 2016, is known for a wide range of game types with mostly original intellectual property. It also launched the Hot Wheels: Rift Rally title that used augmented reality and real toy cars.

[Image: The Asteroid Outpost table in Velan Studios’ Bounce Arcade.]

Bounce Arcade mixes pinball classics like multiball with new ball control mechanics like “attract” and “guide,” and enhances the pinball experience with mini-games only possible in VR. Players who pre-order will receive a 10% discount off the game’s $20 price, bringing it down to $18.

In Bounce Arcade, players will enter the machine and be transported into immersive 3D table arenas featuring classic pinball-inspired mechanics like rails and bumpers, while navigating level-specific challenges.


Players will start with a set number of balls and, using their paddles, must keep the ball in play to rack up points. Paddles can deflect or attract balls, and players can even steer balls in mid-flight with the “guide” mechanic to direct them toward a target.

[Image: The Gunpowder Gulch table in Velan Studios’ Bounce Arcade.]

Each table arena has a distinct theme and presentation, including the space-themed Asteroid Outpost and the Western-themed Gunpowder Gulch, plus two more to be announced at a later date, along with new tricks and secrets to uncover.

The game keeps players on their toes, as the difficulty can increase with multiball gameplay: certain events within a level add multiple balls to the environment, creating periods of more intense action.
