Welcome to insideBIGDATA’s “Heard on the Street” round-up column! In this regular feature, we highlight thought-leadership commentaries from members of the big data ecosystem. Each edition covers the trends of the day with compelling perspectives that can provide important insights to give you a competitive advantage in the marketplace. We invite submissions with a focus on our favored technology topic areas: big data, data science, machine learning, AI and deep learning. Enjoy!
How Successful Companies Extract Value from Data. Commentary by Kausik Chaudhuri, CIO at Lemongrass
“In today’s digital era, big data is highlighted as a driver of business innovation and value. Globally, enterprises across various sectors are awash in massive, constant streams of information from sources such as customer interactions, IoT devices, internal business operations, and even social media. This information, whether legacy data or analytics from recent engagements, can hold the key to insightful business intelligence, predictive modeling, and optimized business operations. However, the deluge can also intimidate organizations attempting to harness its potential. The challenge is not only to collect the data but also to sift through often unstructured data and extract actionable insights that lead to desirable business outcomes.
To help enterprises unlock the immense value hidden in both legacy and newly collected datasets, specialized IT services have evolved to support the mining process. These services offer sophisticated data analytics and machine learning tools to process and analyze data at scale. Data engineers and scientists work in tandem to structure chaotic or mismatched data, untangle inconsistencies, and develop robust data pipelines. These efforts ensure that the data fed into analytics tools is clean, relevant, reliable, and ready for exploration. Once prepared, advanced algorithms and models delve into this data to identify patterns, trends, and anomalies that would be imperceptible to casual observation. For a business, identifying these patterns could mean discovering new customer trends, identifying a potential future delay in supply chains, or even predicting market shifts before they happen. For instance, a retailer might identify purchasing behaviors that signify a new trend, enabling them to adjust their stock and marketing strategy accordingly. Alternatively, a manufacturing firm might spot inefficiencies in their production line and realize a need for retooling, leading to cost-saving adjustments. Essentially, by leveraging IT expertise alongside state-of-the-art tools and aligned business practices, enterprises can transform raw data, either legacy or newly collected, into a goldmine of actionable intelligence, driving business efficiency, innovation, and supporting a potential competitive advantage.”
Integrating AI into the workplace requires a hands-on approach. Commentary by Victoria Myers, Global Head of Talent Attraction at Amdocs
“We’ve seen how AI can provide shortcuts for daily work, from writing and sending emails to scheduling meetings to drafting social media posts and much more. In these cases, AI is clearly thriving, and it’s crucial to keep up. However, it’s not enough to simply expect employees to use AI now that it’s more readily available; companies must also set the right boundaries, guidelines, and opportunities for employees to learn more about it.
Before the pandemic, our talent and technology teams saw the need for faster, more flexible operational environments for our customers to keep pace with digital-first players that move at the speed of light. With that, we trained thousands in our workforce on microservices and DevOps to prepare them for this shift. Now, companies need to take the same approach with AI, offering company-wide learning sessions, workshops, and hands-on training to ensure it’s being used safely and effectively.”
Think AI can solve IT’s data problem? Proceed with caution. Commentary by Brett Hansen, Chief Growth Officer, Semarchy
“Organizations are already challenged to successfully translate the vast quantities of data they capture into actionable insights. As the internet of things (IoT) expands and organizations adopt edge technologies powered by AI and machine learning (ML), data generation will skyrocket further, increasing the complexity of effectively utilizing the data. Therein lies the problem: AI cannot solve all your data problems. In fact, without a solid data foundation of organized, high-quality data, AI cannot function as intended — period. AI and ML can only operate on the information they are provided, and without accurate data, these systems are prone to generating incomplete, misleading, or downright inaccurate outputs.
And yet, AI is an advantageous technology with infinite potential in countless business applications. Business and data leaders should be actively developing strategies to leverage AI to accelerate transformation of data into business insights and actions. Organizations should proceed with caution and ensure they use complete and quality data within initial AI initiatives. I urge leaders to implement an exhaustive data cleansing and management strategy that empowers AI to create logical outputs. After all, if you cannot locate and provide cleansed data in your organization’s sprawling systems, how can you expect AI to do so?”
Impact of JDK21. Commentary by Simon Ritter, Deputy CTO of Azul
“JDK 21 is officially released on September 19th and will be the next long-term support (LTS) version of OpenJDK distributions. This is significant, as many enterprise users exclusively use these versions in production. In the long term, this will lead to substantial adoption. However, most users will wait six to twelve months before deployment, allowing the new features to undergo more real-world testing and bug-fix updates to be released. JDK 21 contains a host of new features covering all aspects of the platform: language, libraries, JVM, and tooling.
JDK 21’s most prominent inclusion is probably virtual threads, which move from preview to full feature. These are ideal for improving the performance and scalability of applications that use the thread-per-request model, where threads spend significant time blocked on IO. Developed as part of Project Loom, this feature has been eagerly anticipated for some time. For developers, there are various language-level features, primarily around pattern matching: both record patterns and pattern matching for switch become full features, and there is also a small but powerful addition of unnamed patterns and variables.
The Vector API (for numerically intensive computations) and the foreign function and memory API (part of Project Panama) continue to be developed in incubator and preview form, respectively. Given the aggregate list of new features included since JDK 17, this new LTS release of Java will be attractive to developers and end-users.”
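The two headline language features Ritter mentions can be seen in a few lines. The following is a minimal sketch (class and record names are invented for illustration) showing record patterns inside a pattern-matching switch, plus a virtual thread started via `Thread.startVirtualThread`; it compiles and runs on JDK 21:

```java
// Sketch of two JDK 21 features: pattern matching for switch with
// record patterns, and virtual threads (Project Loom).
public class Jdk21Demo {
    record Point(int x, int y) {}

    // Record patterns destructure the Point directly in the switch;
    // a `when` clause adds a guard to the first case.
    static String describe(Object obj) {
        return switch (obj) {
            case Point(int x, int y) when x == y -> "point on the diagonal";
            case Point(int x, int y)             -> "point at (" + x + ", " + y + ")";
            case Integer i                       -> "integer " + i;
            default                              -> "something else";
        };
    }

    public static void main(String[] args) throws InterruptedException {
        // Virtual threads are cheap to create; blocking inside one
        // no longer ties up an OS carrier thread.
        Thread vt = Thread.startVirtualThread(
                () -> System.out.println(describe(new Point(3, 3))));
        vt.join();
        System.out.println(describe(42));
    }
}
```

In the thread-per-request model Ritter describes, the scheduler can multiplex millions of such virtual threads over a small pool of carrier threads, which is what makes blocking IO cheap again.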
Data Mesh — Why Now? Commentary by Claude Zwicker, Senior Product Manager at Immuta
“Central IT creates bottlenecks that significantly slow down advanced analytics and artificial intelligence (AI) initiatives, and the cloud has unfortunately not solved these challenges. Data mesh – a concept first coined by industry vet Zhamak Dehghani in 2019 – presents a potential solution to this challenge, but many have since been skeptical about its viability and need. Now, the current AI craze has brought renewed urgency to the situation and reignited the conversation around the role data mesh can play.
AI and machine learning (ML) applications have an immense appetite for data, and suboptimal results can stem from incomplete or inaccurate data input. It’s hard to imagine that central IT can give all of this data the attention and care it needs for AI models to be optimal. Data mesh, on the other hand, puts data-related care and decisioning in the hands of individual departments and business users most knowledgeable about that data. It creates a self-organizing mesh in which different groups around the business can come together, define their data requirements, agree on how new data is to be shared, and align on the best ways to employ that data.
This decentralized approach provides fast data access, improves data quality, and better aligns data solutions to business objectives. However, for organizations to really experience the benefits of data mesh, perceptions need to shift. Traditionally, people have conflated data mesh with an out-of-the-box solution, when in reality it is a journey that is just as much about people and process as it is about technology. For example, the decentralized nature of data mesh architectures leads to an explosion of data access policies, so scaling the traditional role-based access control model for each data product without any enterprise-level governance oversight brings significant complications. Setting up a democracy requires more than just a constitution; it requires processes for electing government officials down to the regional and state level and for keeping those officials accountable. Once those processes are in place, a democracy can support a governing body that is constantly evolving and improving on itself. The same goes for data mesh and its ability to support AI initiatives and cloud migration through a strong, well-organized data governance foundation.”
True Innovation Comes from the Platform Ecosystem. Commentary by Jeremy Guttman, SVP of Product + Engineering at OneStream Software
“Product and engineering teams should not be focused on cranking out products as quickly as possible, but rather on what the value can ultimately be for the customer and end-user. Providing a platform ecosystem is essential for unlocking real innovation because it allows SaaS customers and partners to develop software on our software. When we enable the ecosystem with the capabilities to design solutions, they produce tools that might not have even been on our radar as a priority item for a number of reasons. In turn, our product roadmap can be informed by the activity and capabilities customers request to fuel their innovation.
Ecosystem-driven solutions can be downloaded and configured easily, giving end-users new solutions in a matter of minutes. This drastically accelerates time to value, with ML and AI technologies that people can deploy almost immediately, and enables users to extend the value of their investment in the platform.
As the ecosystem grows, the number of use cases and data volumes increase as well. This growth requires data management and data wrangling features that are more advanced. Generative AI opens up new opportunities to enrich existing processes with greater content and information in a seamless and intuitive user experience.”
Gen AI’s accessibility revolution. Commentary by Jiani Zhang, Chief Software Officer at Capgemini Engineering
“As an expert in industrial analytics, I have spent much of my professional career examining the relationship between data analytics and artificial intelligence. This relationship had always been much like a staircase with AI at the top – the aspirational point that enterprises were striving towards. However, to get to the top of this staircase, organizations first had to establish strong foundations for their data collection and analysis capabilities, as well as their connectivity. With generative AI, enterprises suddenly feel as though AI is fundamentally in reach, signifying that its power lies in its accessibility.
The main difference between generative AI and conventional AI is the simplicity of accessing and analyzing data. Previously, there were many cycles spent on data collection (including access and connectivity) and data integrity in order to generate a successful predictive AI model. Today, the creation of a large language model (LLM) is relatively straightforward, and conversational prompts can be used to extract data. The data in the LLM can then be augmented by other externally sourced data, alleviating concerns about having a robust enough data set to create a successful model.
The simplicity surrounding generative AI is what makes it such an accessible and appealing innovation in the eyes of both consumers and businesses. This technology has the potential to transform entire industries, as well as fundamentally alter the ways in which we live and work. Yet, with a laptop or smartphone, anyone can access generative AI in some capacity. This level of access is quite revolutionary and holds promise for the longevity of generative AI technology.”
Microsoft AI Copilot. Commentary by Forrester Senior Analyst Rowan Curran
“Microsoft’s announcement of the unified Copilot experience lays the groundwork for the cohesive, consistent experiences promised by generative AI. Generative AI is only as powerful as the data and systems it connects. By linking the Copilot experience across all its different products, Microsoft is beginning to enable this. Being able to plan and act across different domains of work and life is essential for genAI applications to drive deep and truly transformational capabilities. As this rolls out to customers later this year, the pervasive Copilot vision has the potential to help focus a key employee resource, attention, on the tasks that matter most.”
What we’ve seen today is Microsoft’s vision of a future with generative AI, one that has been building over recent months: a future where generative AI is an intrinsic piece of nearly all apps, workstreams, and business processes. To make sure this vision succeeds, Microsoft will need to help its customers make the best and most responsible use of genAI.
People are incredibly enthusiastic about using generative AI, but most people need help understanding its limits, risks, guardrails, and customizations to help them leverage the technology’s maximum impact in their work and personal lives.”
Leveraging Generative AI for Smart, Personalized Chatbots in Business. Commentary by Samanyou Garg, CEO and Founder of Writesonic
“The current generation of chatbots tends to be rule-based, responding to specific inputs with pre-programmed responses. While these have been useful to a degree, they lack the flexibility and personalization necessary to fully optimize user interactions and enhance user experience.
By integrating generative AI models, like GPT-4, into chatbots, businesses can offer more dynamic, intelligent, and personalized responses. This can be achieved by feeding the chatbot various types of data—such as knowledge bases, PDFs, and URLs—and indexing them in a vector database. Given a new user chat query, the AI can then retrieve semantically relevant text snippets from this database and incorporate them into the GPT prompt, producing unique and tailored responses. Additionally, generative AI can be fine-tuned based on a company’s unique writing style, allowing chatbots to maintain a consistent brand voice during user interactions.
But the potential of generative AI chatbots extends beyond customer support. These advanced chatbots can significantly enhance user experiences across industries, from e-commerce to real estate. Consider a scenario where a user wants to buy a t-shirt from an e-commerce site. Traditionally, they would need to use multiple filters and browse search results. With a generative AI chatbot, they could simply state, “Hey, I want a black t-shirt with white stripes,” and the chatbot could instantly provide the top 5 matching items.
With their capability to facilitate intelligent, personalized interactions, generative chatbots are poised to redefine customer experience and business operations. As businesses further harness AI’s potential, the rule-based chatbots of today will seamlessly transition into the user-centric, generative AI-driven chatbots of tomorrow.”
AI isn’t going to change businesses overnight, but we need to adapt wisely. Commentary by Romain Niccoli, co-founder and co-CEO of Pigment
“The current economic climate is shifting faster than ever before and it’s crucial for businesses to stay nimble and adjust quickly to the constantly changing circumstances. This can be challenging when decision-making is based on incomplete, inaccurate, or siloed data. The rise of artificial intelligence has undoubtedly revolutionized various aspects of business operations, including financial decision-making, by making it easier to access and analyze data. Nonetheless, business leaders must strive for balance and avoid relying too much on AI to make strategic decisions.
The key is to cultivate a culture of responsible AI adoption, balance quantitative and qualitative data, and identify and address AI-specific risks, such as bias, data security, and misleading outputs. When faced with high-stakes decisions that require quick action, AI can be a valuable tool for businesses to navigate uncertainty and get information quickly. But while AI can elevate decision-making and productivity, it can’t make the final decision for you. Human oversight will always be required. As we move towards an AI-driven future, it is important for business leaders to ensure that their organizations are well-prepared for this inevitable shift.”