
Six Ways to Bolster Your Data and AI Team In a Down Economy

June 7, 2023



With persistent inflation threatening to tip the economy into a recession, many companies have taken up defensive positions. While it doesn’t signal the end of big data analytics and AI projects, it may be a sign that companies need to rethink their approaches.

Here are six ways that big data and AI teams can adapt during an economic recession:

1. Focus Relentlessly on Customer Satisfaction

Studies suggest it can cost up to 25 times more to acquire a new customer than to retain an existing one, so focusing on the customers you already have, rather than spending to find new ones, makes sense. According to Bain & Company, increasing customer retention rates by 5% can increase profits by 25% to 95%.


If you have a data warehouse, you’re likely already tracking customer satisfaction and related metrics, such as customer churn rates and net promoter scores. There are a variety of ways to employ AI and advanced analytics to bolster these scores, thereby increasing customer satisfaction.
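As an illustration, here is a minimal sketch of computing those two metrics from a warehouse extract. The file name and columns (customer_id, churned, nps_score) are hypothetical placeholders for whatever your warehouse actually stores.

```python
import pandas as pd

# Hypothetical extract from the warehouse: one row per customer, with a churn flag
# for the period and the customer's latest 0-10 survey response.
customers = pd.read_csv("customer_metrics.csv")  # columns: customer_id, churned, nps_score

# Churn rate: share of customers who left during the period.
churn_rate = customers["churned"].mean()

# Net promoter score: percent promoters (9-10) minus percent detractors (0-6).
promoters = (customers["nps_score"] >= 9).mean()
detractors = (customers["nps_score"] <= 6).mean()
nps = 100 * (promoters - detractors)

print(f"Churn rate: {churn_rate:.1%}   NPS: {nps:.0f}")
```

Tracking these numbers over time is also what makes the next step, predicting which customers are about to leave, possible.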

You could find out what your customers don’t like and fix it. Or, on the flip side, find out what customers like about doing business with you and expand on it. Other areas with low-hanging fruit include product recommendations, more targeted marketing campaigns, and more analytically powered lead generation.
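One concrete version of that analytically powered approach is a simple churn-risk model, so that retention offers and campaigns go to the customers most likely to leave. The sketch below is illustrative only: the feature columns are hypothetical, and any classifier from your existing stack would do.

```python
import pandas as pd
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

customers = pd.read_csv("customer_metrics.csv")  # same hypothetical extract as above

# Placeholder features; use whatever behavioral signals your warehouse actually tracks.
features = ["support_tickets", "days_since_last_order", "discount_used"]
X, y = customers[features], customers["churned"]

# Hold out a test set so the model's lift can be measured before anyone acts on it.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)
model = GradientBoostingClassifier().fit(X_train, y_train)
print("Holdout AUC:", roc_auc_score(y_test, model.predict_proba(X_test)[:, 1]))

# Score everyone and hand the highest-risk customers to the retention team first.
customers["churn_risk"] = model.predict_proba(X)[:, 1]
at_risk = customers.sort_values("churn_risk", ascending=False).head(100)
```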

Doubling down on consumer data could pay dividends during a down economy, according to Near Intelligence, which recently released its “State of Global Consumer Behavior Data Survey.” “Without a steady stream of up-to-date data on consumer behavior, companies are essentially flying blind,” said Steven Williams, chief research officer for Hanover Research, which conducted the survey for Near.

2. Use AI to Boost Employee Productivity

Human resources are typically the biggest item in a company’s budget, so one great use of AI and analytics is to help each worker get more done.

Stanford and MIT recently published a study called “Generative AI at Work” that looked at the impact chatbots and large language models had on call center workers. They concluded that AI increased worker productivity by 13.8%, which, as Moveworks CEO and co-founder Bhavin Shah pointed out, is about the same as cumulative inflation over the past two years.


You won’t be able to adapt every position to utilize AI. But with a little creativity and willingness to try new things, you’ll find ways to augment your existing workforce with AI capabilities.

New research suggests companies may be able to use LLMs to completely replace some positions, including data analyst. DAMO Academy, the research arm of Alibaba Group, says GPT-4 beat an entry-level human analyst on performance and performed comparably to a senior-level analyst.

“The experiments showed that GPT-4 is not just significantly cheaper than a human data analyst, but also much faster in completing the tasks, according to the study findings,” according to a story published by South China Morning Post (which is also owned by Alibaba).
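The article doesn’t describe DAMO Academy’s experimental setup, but the general pattern is easy to try: hand an LLM a table schema and a business question and let it draft the analysis. A minimal sketch using the OpenAI Python client (v1+), with a made-up schema, might look like this; treat the output as a draft to review, not an answer to trust blindly.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

schema = "orders(order_id, customer_id, order_date, revenue)"   # hypothetical table
question = "Which ten customers generated the most revenue in the last 90 days?"

response = client.chat.completions.create(
    model="gpt-4",
    messages=[
        {"role": "system", "content": "You are a data analyst. Reply with a single SQL query."},
        {"role": "user", "content": f"Schema: {schema}\nQuestion: {question}"},
    ],
)
print(response.choices[0].message.content)  # review before running against production data
```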

3. Slash Your Cloud Spending

When your budget tightens, it could make sense to look at one of the line items that has grown fastest in recent years. That’s right: we’re talking about cloud spending.


Since the pandemic started, the public cloud platforms have grown considerably, thanks to healthy demand from companies that no longer want to host a physical data center. In 2021, the analytics firm Anodot found that nearly one-third of data professionals reported their monthly cloud bills had gone up by nearly 50%, while one-fifth had seen their cloud bills double. A more recent survey from Pepperdata found that 57% of decision-makers had run into significant or unexpected cloud spending.

Nobody likes surprises, least of all the CFO, so naturally the situation has led to new tools that help rein in cloud spending. Capital One Software developed a suite of tools called Slingshot based on its experience migrating from an on-prem Teradata environment to Snowflake.

You can also find cheaper compute by using spot instances on AWS. Storage provider NetApp, for instance, has made a business of helping customers adopt spot instances on AWS, Microsoft Azure, and Google Cloud, which can shave upwards of 90% off the cost of compute.
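As a rough sketch of what that looks like in practice, the boto3 snippet below checks recent spot prices and requests an interruptible spot instance. The region, instance type, and AMI are placeholders, and spot workloads must be able to tolerate being reclaimed.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region

# Compare recent spot prices against the on-demand rate before committing a workload.
history = ec2.describe_spot_price_history(
    InstanceTypes=["m5.xlarge"],            # placeholder instance type
    ProductDescriptions=["Linux/UNIX"],
    MaxResults=10,
)
for offer in history["SpotPriceHistory"]:
    print(offer["AvailabilityZone"], offer["SpotPrice"])

# MarketType "spot" asks EC2 for spare capacity at the (much lower) spot price.
ec2.run_instances(
    ImageId="ami-0123456789abcdef0",        # placeholder AMI
    InstanceType="m5.xlarge",
    MinCount=1,
    MaxCount=1,
    InstanceMarketOptions={"MarketType": "spot"},
)
```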

4. Consolidate Your Stack

When the economy is humming and money is cheap, there’s more willingness to take risks. That’s true for internal data and AI projects, as well as for taking a chance on a startup, whether you’re founding one or using its products and services.


However, when the economy slows, that willingness to take a risk on unproven companies or technologies begins to wane. For some companies, that signals a good time to cut back on tech spending, consolidate around what you already have, and regroup for the next push.

“In a tough economy, people go back to what’s working for them,” advises Adam Wilson, a senior vice president at Alteryx and the general manager of Analytics Cloud (formerly Trifacta). “A lot of the science experiments got cut, and I think that’s had a disproportionate impact on a lot of the startups that were out there.”

5. Reconsider the Need for GPUs

Your data scientists might have their eyes on a sparkly new Nvidia GPU. Indeed, since the pandemic, it has been tough to get one’s hands on all sorts of high-end processors, including GPUs, which excel at machine learning workloads and training deep learning models.

The recent surge of interest in LLMs, most of which are trained on Nvidia GPUs, is credited with pushing the Santa Clara, California-based chipmaker’s valuation above $1 trillion for the first time. Alas, with so much demand, your data scientists may have to learn to do without.

“Everyone wants Nvidia A100s to run their models,” says Luis Ceze, CEO of OctoML and a computer scientist at the University of Washington, “but there’s simply not enough of them.”

While an A100 may train your new deep learning model the fastest, not having one is not a dealbreaker. There are frameworks that separate deep learning runtimes from the underlying hardware. Apache TVM, which Ceze helped create, is one example. Another is ONNX, which was spearheaded by Microsoft and Facebook.
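For example, a model trained in PyTorch can be exported once to the hardware-neutral ONNX format and then served with ONNX Runtime on whatever machines you have. Here is a minimal sketch, using a tiny placeholder model in place of a real trained network so it runs anywhere:

```python
import torch
import onnxruntime as ort

# A tiny stand-in for your real trained model.
model = torch.nn.Sequential(torch.nn.Linear(16, 8), torch.nn.ReLU(), torch.nn.Linear(8, 2))
model.eval()

# Export once to ONNX, a hardware-neutral graph format.
dummy = torch.randn(1, 16)
torch.onnx.export(model, dummy, "model.onnx", input_names=["input"], output_names=["output"])

# Run it with ONNX Runtime on the CPU (GPU execution providers can be swapped in later).
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
print(session.run(None, {"input": dummy.numpy()}))
```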

You don’t need GPUs to use many pre-trained LLMs; in fact, regular CPUs may be good enough to run inference. There’s also a world of much smaller, easier-to-train language models that don’t necessarily need a GPU at all.
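As a quick illustration, the Hugging Face transformers pipeline below runs a small generative model entirely on CPU; distilgpt2 is just an example of a compact, freely available model, not a recommendation for any particular task.

```python
from transformers import pipeline

# device=-1 pins the pipeline to CPU; distilgpt2 is a small example model (~82M parameters).
generator = pipeline("text-generation", model="distilgpt2", device=-1)

prompt = "Our top three customer complaints this quarter were"
print(generator(prompt, max_new_tokens=40)[0]["generated_text"])
```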

6. Focus on Training and Education

One time-honored tradition for dealing with weak markets is to head back to school. This may not be an option for senior-level AI engineers and data scientists, but it could be a good move for younger folks.

Universities have been ramping up their data programs in response to hot demand over the past decade. The latest is UC Berkeley, which recently unveiled its first new college in more than half a century: the College of Computing, Data Science, and Society.

A good place to start a data science back-to-school search is the Academic Data Science Alliance. Led by founder and executive director Micaela Parker, a Datanami 2023 Person to Watch, the non-profit is a great data science resource for students and universities alike.

Related Items:

AI Chatbots: A Hedge Against Inflation?

Venture Capital Funding Plummets, But AI Investment Growing Strong

UC Berkeley Envisions a Data-driven Future with New College of Computing, Data Science, and Society

Vendors:
Academic Data Science Alliance, Alibaba, Anodot, AWS, Google Cloud, Microsoft Azure, Moveworks, NetApp, NVIDIA, Pepperdata, Snowflake, Teradata



