An overview of the new Salesforce Data Cloud, Data Lake for Nonprofits on AWS, and Heroku Connect
The most important question to answer before moving to Salesforce is, “What data are you planning to store in Salesforce?”
It seems like a simple, and maybe obvious, question, but it is one of the earliest and most important questions to consider when moving to Salesforce, because “everything” is often not the right answer. In fact, trying to use Salesforce to store all of your data (past, current, and future) is one of the few reasons Salesforce implementations fail.
Every record stored in the core Salesforce CRM comes with some form of cost. It’s a small cost that can be easy to miss among all the critical decisions that come with implementing Salesforce, but it’s a cost that scales with each row you add. There are three ways to understand this cost:
- Performance: Nobody likes to wait for a computer. With too much data, page loads, reports, and record saves can slow to a crawl, taking minutes or timing out entirely. While working with an experienced partner can mitigate these performance issues through strong design and planning, being strategic about the data you choose to load is the best and easiest way to set your team up for efficient processes.
- Organization Complexity: Often, more data means serving more constituencies, supporting more disparate operations, and, ultimately, maintaining a more complex Salesforce instance. The tradeoff here is nuanced—Salesforce offers enormous value by providing a 360° view of your constituents, but not all data adds the same value to that view, especially if that data also requires substantial customization and automation to support.
- Money: Salesforce storage costs money. Each license comes bundled with a certain amount of storage; when you use it up, you need to buy more. Buying additional storage from Salesforce is more expensive per gigabyte than many of the storage options discussed below.
Salesforce is a tool to drive and track relationships and to streamline organizational processes. While Salesforce can be customized to do almost anything, clearly defining and understanding the role it will play within your organization can help your team avoid the performance and complexity problems that come from trying to be everything to everyone.
Designing Your Data Landscape—Questions to Consider
What value does historical data offer to your day-to-day operations, and where does that value come from? Does it matter that a donor gave $10 in 2012, or does it matter that their first gift date and lifetime giving totals are accurate? Does it matter that fundraising staff can edit or kick off Salesforce automation based on that data, or would read-only access be enough?
Think about all the sources and consumers of data across your organization. Is Salesforce the primary producer and consumer of data, or is there an ecosystem of related systems for marketing, finance, advocacy, or other operations to consider? Do you have tools to identify and prioritize the best quality data across those platforms, or does each system store its own limited sliver of a person’s information?
You can think about your data like a pyramid, with the data powering day-to-day operations at the top and rarely accessed historical data at the bottom. Somewhere on that pyramid is a cutoff point below which the data no longer provides enough value to justify storing it in Salesforce.
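As a concrete illustration of that cutoff, the minimal sketch below (in Python with pandas) shows how row-level giving history could be rolled up into the summary values fundraisers actually use, such as first gift date and lifetime giving, before the detail rows move off-platform. The file name and column names are hypothetical placeholders, not a prescribed format.

```python
# Minimal sketch: compute roll-up summaries from a historical gift export so
# that only the summaries need to live in Salesforce while the detail rows are
# archived off-platform. The file and column names (donor_id, gift_date,
# amount) are hypothetical and would match however your export is structured.
import pandas as pd

gifts = pd.read_csv("historical_gifts.csv", parse_dates=["gift_date"])

summaries = (
    gifts.groupby("donor_id")
    .agg(
        first_gift_date=("gift_date", "min"),
        last_gift_date=("gift_date", "max"),
        lifetime_giving=("amount", "sum"),
        average_gift=("amount", "mean"),
        gift_count=("amount", "count"),
    )
    .reset_index()
)

# These per-donor summaries could then be loaded back into Salesforce while the
# row-level history moves to the warehouse, lake, or archive.
summaries.to_csv("donor_giving_summaries.csv", index=False)
```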
For many organizations, this historical data is most efficiently stored in a data warehouse, data lake, or archives. These are terms that often overlap, so it’s worthwhile to define them precisely.
Definitions – Lakes, Warehouses, and Archives
- Data warehouse: a central database that consolidates structured, cleaned data from multiple systems and organizes it for reporting, integration, and day-to-day operational use.
- Data lake: a repository for large volumes of raw or lightly processed data, giving analytics and data science teams the flexibility to explore it and find new insights later.
- Archive: low-cost, long-term storage for historical records that are rarely accessed but still need to be retained and occasionally referenced.
Selecting an Architecture for Your Nonprofit Organization
For nonprofits with one or two key systems, an archive of Salesforce data can offer efficiency and cost savings. Attain Partners has designed and built archive solutions that preserve giving histories, summaries, and even allow fundraising staff to view historical gifts while storing all data off-platform.
For organizations with multiple data systems functioning as both producers and consumers of data, a data warehouse architecture coupled with a Master Data Management plan can centralize complex integrations, streamline operations, and improve data quality across systems. For organizations with mature or growing business intelligence or analytics teams, a data lake can provide data science teams with the flexibility to seek new insights from existing operational data.
Leveraging External Tools to Create a Custom-built Archival Solution
Data lakes, warehouses, and archives don’t typically offer the same polished, flexible user experience as the core Salesforce CRM, but in exchange they can store large amounts of data efficiently and inexpensively. A big piece of this difference is the underlying database product powering the service. The Salesforce core CRM uses a traditional Oracle database under the hood, whereas these tools use database products designed and built to store and analyze data at very large scale, like Amazon Redshift, Google BigQuery, and Apache Hadoop.
Salesforce Offerings for Nonprofit Data Storage
Salesforce Data Cloud
The newest offering from Salesforce, Data Cloud, draws upon Salesforce’s marketing-focused warehousing and data transformation product, CDP, along with the connectivity of MuleSoft, to deliver a data warehouse and data lake. Data Cloud provides tools in Salesforce to manage integrated source data and schemas, an identity resolution service to build a golden record of constituents against which all data sources are matched, and robust segmentation capabilities far beyond what is offered natively on the platform. Data Cloud works best for organizations with millions, or tens of millions, of rows to warehouse.
Data Lake for Nonprofits on AWS
Announced at Dreamforce 2022, the Data Lake for Nonprofits is a Salesforce and Amazon partnership that streamlines the 25+ steps necessary to connect and mirror Salesforce data in AWS. As a quick-start offering for organizations prepared to own and maintain their own data lake in AWS using Redshift or RDS, this may be the fastest way to get a data lake started in AWS. Like Salesforce Data Cloud, the AWS Data Lake can run with tens of millions of rows of data without issue.
Heroku Connect
Salesforce also offers Heroku Connect, which mirrors the data from entire Salesforce objects in a Heroku Postgres database.
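Once objects are mirrored, staff and applications can work with the data using ordinary SQL. The minimal sketch below queries mirrored opportunity data from Heroku Postgres using Python and psycopg2; it assumes Heroku Connect’s default behavior of mapping synced objects into a “salesforce” schema with lowercased field names, and the specific columns shown depend entirely on how your mapping is configured.

```python
# Minimal sketch: summarize Opportunity records that Heroku Connect has
# mirrored into Heroku Postgres. Assumes the default "salesforce" schema;
# column availability depends on your Heroku Connect mapping.
import os
import psycopg2

# Heroku sets DATABASE_URL for the attached Postgres add-on.
conn = psycopg2.connect(os.environ["DATABASE_URL"])

with conn, conn.cursor() as cur:
    cur.execute(
        """
        SELECT stagename, COUNT(*) AS opportunities, SUM(amount) AS total_amount
        FROM salesforce.opportunity
        WHERE closedate >= %s
        GROUP BY stagename
        ORDER BY total_amount DESC
        """,
        ("2020-01-01",),
    )
    for stage, count, total in cur.fetchall():
        print(stage, count, total)

conn.close()
```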
Both Heroku Connect and the AWS Data Lake Connector offer the benefit of streamlined setup and configuration and, given the flexibility of Heroku and AWS, either can be a starting point for a custom-built warehouse or archive solution. From a pricing and performance perspective, Heroku is best for smaller and mid-market organizations, while AWS offers products like Redshift that can handle large data volumes, tens of millions of rows or more, without performance degradation. That said, the future for Salesforce seems to be with Data Cloud. At the May 2023 New York World Tour, Data Cloud had at least four booths in various areas of the show floor.
Non-Salesforce Offerings for Nonprofit Data Storage
Snowflake Cloud Database
The cloud database Snowflake should also be considered alongside Heroku and the new AWS Data Lake for Nonprofits, as it offers a cost-effective and performant back-end database, native Salesforce connectors for specific clouds, and quick-start tools to begin setup of your data lake or warehouse. If more flexibility is needed, it can also be connected to Salesforce with a data transformation tool like MuleSoft, Talend, or Informatica.
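For teams evaluating Snowflake as the back end, a minimal sketch of reporting on warehoused Salesforce data with Snowflake’s Python connector might look like the following. The account, warehouse, database, schema, and table names are placeholders for whatever your own environment uses, not values defined by the products discussed above.

```python
# Minimal sketch: query Salesforce data that has been loaded into Snowflake
# using the snowflake-connector-python package. Connection parameters and the
# ARCHIVE.SALESFORCE.OPPORTUNITY table are hypothetical placeholders.
import os
import snowflake.connector

conn = snowflake.connector.connect(
    account=os.environ["SNOWFLAKE_ACCOUNT"],
    user=os.environ["SNOWFLAKE_USER"],
    password=os.environ["SNOWFLAKE_PASSWORD"],
    warehouse="REPORTING_WH",
    database="ARCHIVE",
    schema="SALESFORCE",
)

try:
    cur = conn.cursor()
    cur.execute(
        """
        SELECT DATE_TRUNC('year', CLOSE_DATE) AS gift_year,
               SUM(AMOUNT) AS total_raised
        FROM OPPORTUNITY
        WHERE STAGE_NAME = 'Closed Won'
        GROUP BY gift_year
        ORDER BY gift_year
        """
    )
    for gift_year, total_raised in cur.fetchall():
        print(gift_year, total_raised)
finally:
    conn.close()
```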
Civis Analytics
Like Data Cloud, Civis Analytics offers a packaged solution that includes a data warehouse, data quality tools, and lightweight master data management capabilities. Civis Analytics also offers a suite of native connectors to bring in data from Salesforce, EveryAction, and Tableau, along with the flexibility of powerful scripting tools and low-level database access.
Archive Solutions
The Salesforce ecosystem also provides a wide variety of point solutions that focus on specific use cases. OwnBackup, CopyStorm, and DBAmp all offer packaged data archiving solutions. These options are less work to configure and maintain, but they may not offer all the functionality nonprofits require, like the ability to maintain giving summaries for key donors.
The Salesforce Data Cloud, AWS Data Lake, Heroku, Civis Analytics, and point solutions all provide accelerators to speed their implementation, but each offering is a powerful and flexible tool that can handle great complexity in the hands of an experienced partner like Attain Partners.
Attain Partners Brings Extensive Nonprofit Data Expertise
Data migration and strategy are part of every Attain Partners project. Our experts design enterprise-wide architecture and storage solutions, integrated with off-platform storage, for higher education institutions and nonprofit organizations throughout North America.
We have successfully helped many colleges and universities integrate Salesforce into their existing enterprise-wide Master Data Management design, often working from just a client-provided inbound data feed from existing data warehouses and data lakes.
Most recently, we had the opportunity to work with The ALS Association to help manage their data storage costs and improve system performance by offloading over 10 million records into an off-platform archive. The system was designed to provide fundraising staff with a seamless, detailed view of donor giving history across data in Salesforce and in their archive. This new system has empowered The ALS Association to track lifetime giving, average gift amount, and other key giving totals to better communicate with donors and expand their fundraising efforts.
The Attain Partners team values our role in helping our nonprofit clients improve efficiencies and reallocate more of their hard-earned dollars to advancing their missions. Almost every Attain Partners project includes detailed work with data, from designing a system architecture that properly accounts for data volumes and growth to selecting tools and storage solutions that handle those volumes in a performant and cost-effective way. Attain Partners regularly works with tens of millions of rows of data, including larger projects like the recent data migration for the San Francisco public radio station KQED, which reached the 100 million record mark.
Successful data migration to off-platform data warehouses, data lakes, and archives improves overall system performance for staff and Experience Cloud visitors, while also enabling nonprofit business intelligence and analytics teams to quickly generate a complete picture of organization-wide activity.
If you are considering implementing Salesforce, or are already live on the platform and suffering from performance or data-related problems, a thoughtful data architecture and strategy based on a strong decision-making framework can be the difference between success and failure for your project. It can also positively impact your organization’s ability to support donors, fundraise, save money, and spend more time and energy focused on achieving your mission goals. Ultimately, offloading data from Salesforce is an important activity that requires thoughtful energy and a framework for decision-making—Attain Partners is here to walk you through it every step of the way.
Attain Partners – Salesforce Experts
Regardless of whether your organization is just beginning its Salesforce journey or 10+ years into development, Attain Partners is here to help you achieve your goals. Contact our team today to learn how we can help you advance your mission by harnessing the power of Salesforce.
To learn more, check out our Salesforce Innovation services, read case studies about our transformative work, and explore blog posts from our Salesforce experts.
About the Authors
Timothy Fives is the Senior Director of Technology for Attain Partners. In this role, Tim is responsible for providing implementation roadmaps and technical strategies, and he has a passion for emerging technologies that benefit our clients. Tim joined Attain Partners in 2022 and has been implementing enterprise solutions for 20+ years, most recently implementing Salesforce at CRS. Further, Tim is responsible for Attain Partners’ strategic technology team, including Attain Partners’ World Team.
Chris Pifer is a Senior Solution Architect serving higher education and nonprofit clients. As a senior leader, Chris provides architecture, design, and subject matter expertise on the technical implementation of Salesforce and relevant third-party tools. His specialties include solution analysis and design, data analysis and migration, and test and quality assurance. Chris has been working in nonprofit tech since 2003 and in the Salesforce ecosystem since 2009. He is one of approximately 200 people who have been recognized globally by Salesforce as an MVP for his expertise, leadership, generosity, and for the positive impact he has had on the Salesforce ecosystem.