In conversation with...
Jeremy Hurwitz

Voice of the Consultant interview by State Street Alpha

Addressing data management challenges with cloud-first solutions

State Street Alpha kicked off their Voice of the Consultant series by interviewing Jeremy Hurwitz, a Managing Director at Elgin White.

Watch the full-length video (12 minutes) to hear his thoughts and advice, as he answers the following questions:

  • What is preventing asset managers from fully leveraging their investment in data?
  • What are the most significant data utilization challenges facing firms today?
  • What role does data infrastructure play in helping firms prepare for T+1 settlement next year?
  • What are the most important data and technology investments that firms should be making in the years ahead?

Video transcript

Frank Smietana: Jeremy, what’s preventing firms from fully leveraging their investment data?

Jeremy Hurwitz: So, I think, number one, I go back through the legacy journey that firms have been on, and what we found is we have 20 years of technical debt that’s been accumulated in what we consider legacy data platforms. We came from a world where everything was application-centric, where people essentially locked their data down in the application and there wasn’t really an externalized view of data. We moved over the ’90s and 2000s to the data warehouse-centric world and at least started to value data as an asset, externalized data, and started to disconnect data from being hardwired into the underlying applications.

Unfortunately, as we went through those journeys, those data projects tended to be very costly and did not always generate the return on investment that they promised and took a lot longer than expected. So now we kind of have those legacy data platforms, part of what we might consider the hairball or the spaghetti diagram where we tightly bundled and tightly coupled all of our application and data services. So the first thing is we have to decouple all of that, and obviously, the cloud offers the journey to be able to decouple that, to externalize our data, make our data readily available to all consumers across a wide geographic data platform. So that’s number one. 

Number two, we have an issue with governance, ownership, and control of data. For 20 years we spent a lot of time helping firms understand how to govern their business data, and it was a long, difficult journey because we had to build a new organization within firms: data governance leadership, chief data officers, data stewardship, and metadata control. All of that governance control was important for being able to recognize data as an asset and make it front and center to the investment process.

We also discovered there was a huge issue with data definitions and transparency. What does the data mean? Where did it come from? Who owns it? All of those things were undefined within an organization, which is a tremendous liability when you have data as an asset that is undefined.

So that journey to define, decouple, and detangle the data is ongoing. The next part of this that’s really important is this idea of creating frictionless data communication capabilities. This means real-time messaging services and smart workflow processing with pattern recognition, where we start to leverage AI to understand how data needs to move around, without a lot of human intervention and roadblocks in how data can move in a friction-free world. So we have to break out of that old file-based, API-driven world and move to real-time, zero-latency, fully friction-free messaging and flow.

And then I think the last piece of that puzzle is the ability to handle really complex data sets. So we need the next generation of technology to combine digital data, spatial data, graphic data, and whatever sort of NFT data is going to be coming down our pipes.

We need tools and technology around the business intelligence side to be able to combine, analyze, and make predictions and leverage that data into the ultimate investment decision. And I think that’s a long journey that investment firms need to be prepared for. They’ve got to invest in the tools and technology around that stack and move away from this idea that data is an afterthought and that we can let data live and die within our framework without investing in it.

Frank Smietana: What are the most significant data utilization challenges facing firms these days?

Jeremy Hurwitz: So, I think the one thing we see as being the monster is the size, the scale, and the dimensions of data. As we’ve evolved from a more traditional field-and-attribute-level data world, where we worried about getting in security master data, client data, and portfolio data, we’re moving now to a massive-scale data world where the data can be spatial, it can be digital, it can be media. We have this whole deluge of data coming into our landscape, and we don’t have the capability to store it, process it, analyze it, or make sense of it. So one of the biggest challenges we have is changing our traditional data utilization patterns, from handling file-based, record-based data to handling streaming, spatial, digital, dynamic data. So we have to retool our whole data infrastructure in order to handle that. We’ve got to retool the way we store, manage, organize, and then utilize that data to allow us to interpolate, predict, use AI, and use smart, intelligent processing models to make sense of the data. So that’s number one.

Number two, we’re going to have to build what I call frictionless systems. We have to allow data to move without causing a tremendous amount of overhead and delay in the way the data processing cycles work. So all of that has to transition: how we store the data, how we move it to the cloud, how we open up the access and the connectivity to the data, and then the tools that we have to allow the data to move around. So cloud-to-cloud integration, cloud-to-terrestrial integration, legacy access to the cloud, all those things have to happen.

And then we need tools that allow us to make sense of the data. So the next generation of AI and BI tools for prediction and sentiment reading is going to allow us to utilize this data in a more efficient way. We also have to worry about digital control, security, and access to that data. So while we’re opening up and allowing data to flow freely, using different formats and different technology to interpolate the data, we have to make sure we keep it secure. So we have to fortify our data environments, and we have to follow regulations. So we have conflicting demands. On one hand, we want open, full access and full interpretation and use of the data. On the other hand, we have to make sure the data is secure, protected, and maintained historically to meet all of the regulatory requirements we have around the use of data.

And then the third thing really is this idea of reimagining how the next generation of consumers wants to digest their data. So we have what I consider a changing of the guard: Gen X, Gen Y, and Gen Z are the new kids on the block. They’re going to be thinking about data differently. They want to be fed their data through different technology capabilities and different interface methods. And so we have to reimagine what that experience is like for the next generation of people.

So one, we have to bring the data in, open it up, and be able to use the tools and technology to interpret all the data components in there. Two, we have to secure it and then three, we have to make it accessible to the next generation of technology consumers.

Frank Smietana: What role does data infrastructure play in helping firms prepare for T+1 settlement next year?

Jeremy Hurwitz: So I’ve been in the industry for 30+ years, not to date myself, but we’ve been talking about T+1 and T+0 for the entire duration of my career. Ultimately, the industry has to get there. We have to figure out how to break the boundaries of how data moves and how we connect and communicate internally within the firm as it relates to data, specifically transactions and settlements, and also how we connect externally with counterparties and all the third parties that the trade needs to communicate with.

Ultimately, we see three key components there.

One is we have to open up the data model, which means cloud, cloud enablement, and cloud elasticity, all those things that essentially allow us to grow, scale, and openly connect across cloud-native environments so everybody can get access to data without having to go through the laborious sort of checkpoints and boundaries that we used to have traditionally, kind of like the European Union. Once we open up those boundaries, you know, all the transactions can flow.  

Two, we have to create frictionless transaction communication cycles. And that means going into real-time messaging. It means going into blockchain where we can see simultaneous acceptance and settlement of transactions happening in real time. 

And then three, we have to move the people and the process around the technology and the messaging into an automated model. So, we have to get away from people roadblocks. We have to get away from these manual processes that have happened over decades within our industry, and everything has to become frictionless. So, we need open cloud access and architecture. We need real-time, frictionless messaging, and we need agile human processing cycles that allow us to very quickly react to changes in the market, very quickly make changes to transactional cycles, and allow us to sort of stay in a near real-time model that will ultimately be the platform for allowing T+0 to occur.

Frank Smietana: Jeremy, what are the most important data and technology investments that firms should be making in the years ahead?

Jeremy Hurwitz: This falls into the people, process, and technology side. There are certain fundamental areas in which each firm has got to advance itself and get to a certain level of competency.

One is we all know migration to the cloud is critical. And we’ve seen a dramatic change in the last three years in terms of the hesitation to move to the cloud. Private cloud was the product of choice. Now we know that the majority of tier-one firms are using open cloud, AWS, Azure, you know, whatever it is in terms of true cloud-native capability. So that’s one in terms of freeing up the data and the access and the expandability and scalability of the cloud.  

Two is we have to start investing in the next generation of technology, and this is where it’s challenging for asset management firms. We have so many day-to-day problems that we’re struggling to deal with, you know, just the processing cycles of getting through the day, closing the books, opening the books in the morning. But we have to start thinking about leading-edge investments. And we see this as firms having what we call technology test labs, where they start thinking about how they are going to get into the next generation of technology. This includes starting to do development with Web 3.0, with blockchain, and with AI.

Obviously, we think of business intelligence as something that’s leading edge, but we found that most firms are really lagging behind on the business intelligence side. We might consider business intelligence as client reporting or some really basic analytics wrapped around our portfolio reporting. But we really have to start breaking through those barriers and getting into the next generation of true business intelligence: leveraging AI, leveraging predictive models, leveraging smart processing. And that really comes from investing in lab testing. So, picking out new products, doing short agile cycles of 3 to 6 months to test new technology, and figuring out how you can outperform your competitors, because ultimately that’s the game we’re in.

We have to think about data as a manufacturing process. If you were in a manufacturing plant, you would have to have zero defects, a high-speed reaction to market changes, and products that are reliable, predictable, and consumable. We have to think about a manufacturing process around data, but we have to add all the new leading-edge technology flavors to that.

And so that really means hiring new young talent. And I think that’s the biggest problem: we have a traditional view of the resource world. I think asset managers, when they go into this lab approach of testing and organizing new tech, have got to hire from new frontiers. They’ve got to go hire people who have gaming experience, people who have AI experience, people who have been in the spatial data space. And I think we have to move a little bit away from the traditional operational data people to some of these new, tech-talented people and start to expand our horizons.

Frank Smietana: Thanks for sharing your insights with us today, Jeremy.

Jeremy Hurwitz: My pleasure.


The cloud is our journey towards decoupling application and data services, and externalizing our data, making it readily available to consumers across a wide geographic data platform.

Jeremy Hurwitz
Managing Director, Elgin White

Jeremy Hurwitz

Managing Director - North America

Based in Los Angeles, California, Jeremy founded InvestTech Systems Consulting LLC in 1990. As President and Director of the Enterprise Data Management and Architecture Practice for more than 25 years, he managed over 50 large investment technology, strategy, and implementation projects for many of the largest global investment management firms. Following the acquisition of his firm by Accenture in 2017, Jeremy remained as Managing Director, Asset Management Lead.

Jeremy joined Elgin White in 2022 to lead the firm’s launch into North America. He is recognized throughout the North American asset management community as a thought leader and innovation architect in the design of enterprise data, analytics, and reporting platforms. His deep knowledge, broad buy-side network, and extensive experience successfully deploying business-outcome-focused technology solutions have accelerated the growth of Elgin White in North America.

Stanley Drasky

Managing Director

Stan joined Elgin White in 2020 following an impressive 30-year career working predominantly for world-class buy-side financial services firms in North America and Europe. He is a renowned buy-side technology leader with a reputation for designing and delivering large-scale transformation initiatives at pace. Currently, Stan is especially active leading infrastructure (cloud and managed services) transformation projects.

His most recent roles prior to Elgin White include Chief Information Officer (EMEA) at State Street and various senior positions during his eight-year tenure at Northern Trust Corporation, including Head of IT & EMEA/Global Director of Fund Administration Technology, Global Director of Asset Management Technology, and Director of Trade Execution Technologies. Stan has also held senior Operational and Technology roles at Nuveen Investments and BNY Mellon.

Throughout his career Stan has been a strategic client or partner of every major front-to-back office technology vendor including SimCorp and Charles River Development.

Erik Schutte

Managing Director

Erik founded Elgin White’s consultancy practice in 2019, which transitioned the firm into a full-service buy-side consultancy and resourcing business. He is an experienced front-to-back business transformation professional and has held several senior management positions during his more than 30-year career in the UK and his native Netherlands, as well as Asia Pacific.

Previous roles include Head of Professional Services at SimCorp, Director at PwC UK, and Managing Director for Wealth & Asset Management Services at Accenture UK & Ireland.