Snowflake Cloud Data Platform

Put your data in one place: Snowflake

Snowflake Data Cloud is the modern data platform, providing a highly scalable cloud-native environment to collect, integrate, transform, and analyze your enterprise’s most important data.

What is Snowflake?

The Snowflake Data Cloud, explained

Snowflake is a Data Cloud: a truly cloud-native software-as-a-service platform for hosting and analyzing your enterprise's analytical data workloads.  There's no software to install and no hardware to rack.  You simply pick a cloud provider and geographic region, and Snowflake deploys all of the necessary servers, networking, and software.

We use Snowflake as the centerpiece of your data platform and augment it with feature-rich industry offerings like:

  • Fivetran for landing data from all of your different SaaS and on-premises operational data systems (think OLTP / traditional application-specific databases)
  • dbt for our extensive modeling through data pipelines and SQL orchestration, once the data is in Snowflake.  Each transformation results in new tables and views inside the same Snowflake instance, a powerful orthogonality that unlocks serious analytical capabilities.

A true SaaS offering, Snowflake is priced on usage.  That means you can start building your cloud data platform today, without having to navigate procurement and obtain approval for huge capital expenditures.

With Snowflake, you can start small, answer some key business questions, and then grow as far as you want.

Accelerate your Snowflake Data Cloud

Vivanti’s Accelerator Programs give you the templates to get going faster.

Sure, you could start from nothing and build up your schemas, RBAC policy, and modeling code piece by piece. But wouldn’t you rather start with some templates that propel you towards your data goals much faster?


With Vivanti’s Accelerator Programs, you start with our curated set of security policies, schema and database layouts, and more.  Don’t waste time reinventing the wheel; let Vivanti do the heavy lifting for you.

What’s So Great About Snowflake?

Snowflake is our number one choice for analytics data platforms.  First and foremost, it uses a usage-based consumption pricing model: standing up a small warehouse to answer a few key questions won’t cost much.  By the time you’re spending at the same level as a legacy on-premises data warehouse, you’re running so many insight-generating analytical workloads that the outlay is easy to justify.

Above and beyond that, however, here are six features of Snowflake that we think set it apart from the crowd.

SQL Columnar Database Engine

SQL is the lingua franca of the data world, and Snowflake embraces that.  The columnar database engine underpinning Snowflake is perfect for online analytical processing workloads that stretch into the billions of rows.

Scale-out storage

You don’t have to worry about storage any more.  In the modern cloud world, storage is so cheap and plentiful that it’s effectively a commodity.  Snowflake’s attitude toward disk is simple: when you need more, you get more.  Gone are the days of trying to forecast how many shelves to buy for the SAN in the machine room.

Continuous data protection

Between Time Travel and Fail-safe, you never have to worry about losing data.  Fail-safe protects your data from software faults and bugs in Snowflake itself (however rare those may be).  Time Travel lets you undrop tables and undelete rows – effectively insulating you from operator error, even after the database session is closed.
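To make this concrete, here is a sketch of what Time Travel recovery looks like in Snowflake SQL.  The table names and the query ID are illustrative placeholders, not real objects:

```sql
-- Recover a table that was dropped by mistake
UNDROP TABLE analytics.sales.orders;

-- Query the table as it looked one hour ago (offset in seconds)
SELECT *
FROM analytics.sales.orders
AT (OFFSET => -3600);

-- Re-insert rows deleted by an earlier statement, referenced by its query ID
INSERT INTO analytics.sales.orders
SELECT *
FROM analytics.sales.orders
BEFORE (STATEMENT => '01a2b3c4-0000-1111-2222-333344445555');
```

The retention window for Time Travel depends on your Snowflake edition and table settings; Fail-safe recovery beyond that window is performed by Snowflake itself rather than through SQL.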

Secure data sharing (without copying!)

In traditional, on-premises data warehouses, if you wanted to analyze a data set, it had to reside within the database management system.  That often meant loading all data, or a subset, from third parties – like the US Postal Service, other government agencies, and data brokers – often through ETL.  Snowflake drastically changed the game here by allowing a dataset provider to share tables and views directly with other Snowflake customers.  The best part: the data consumer doesn’t have to pay for the storage, since Snowflake never copies the data being shared.
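A minimal sketch of how a share is set up, with all database, schema, table, and account names invented for illustration:

```sql
-- Provider side: create a share and grant read access to specific objects
CREATE SHARE postal_data;
GRANT USAGE  ON DATABASE ref_data               TO SHARE postal_data;
GRANT USAGE  ON SCHEMA   ref_data.usps          TO SHARE postal_data;
GRANT SELECT ON TABLE    ref_data.usps.zip_codes TO SHARE postal_data;
ALTER SHARE postal_data ADD ACCOUNTS = partner_account;

-- Consumer side: mount the share as a read-only database (no storage cost)
CREATE DATABASE usps_reference FROM SHARE provider_account.postal_data;
```

The consumer queries `usps_reference` like any other database, but the data never leaves the provider’s storage.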

Strong governance & security

With sharing comes a heightened need for data governance and security controls built into the data platform.  With Snowflake, we have several tools in the toolbox for safeguarding your data.  Role-based access control ensures that only authorized users are able to access schemas, tables, and other database objects.  Dynamic data masking obfuscates or de-identifies chosen columns, so otherwise sensitive tables can still be queried safely.  Row-level access policies allow sub-table granularity privileges, which is perfect for sharing both internally and externally.
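As a sketch of what these controls look like in practice – policy, role, table, and column names here are all hypothetical:

```sql
-- Dynamic data masking: only the PII_READER role sees raw email addresses
CREATE MASKING POLICY mask_email AS (val STRING) RETURNS STRING ->
  CASE
    WHEN CURRENT_ROLE() = 'PII_READER' THEN val
    ELSE '***MASKED***'
  END;

ALTER TABLE customers MODIFY COLUMN email SET MASKING POLICY mask_email;

-- Row access policy: roles only see rows for regions mapped to them
CREATE ROW ACCESS POLICY region_filter AS (region STRING) RETURNS BOOLEAN ->
  CURRENT_ROLE() = 'ADMIN'
  OR EXISTS (
    SELECT 1
    FROM security.region_map m
    WHERE m.role_name = CURRENT_ROLE()
      AND m.region    = region
  );

ALTER TABLE orders ADD ROW ACCESS POLICY region_filter ON (region);
```

Because the policies attach to the data itself, they apply uniformly to direct queries, views, and shared datasets alike.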

Everyone gets a (virtual) warehouse

Commodity compute resources like Amazon’s EC2 have done amazing things for the application delivery and software engineering disciplines.  Snowflake’s virtual warehouse concept does the same for data analytics.  Finally, data engineering gets dedicated processing power to ingest and transform data, marketing can pay to run more intense queries, and finance can guarantee that their reporting finishes on time.
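Provisioning a warehouse per team is a one-statement affair.  The warehouse names and sizes below are illustrative:

```sql
-- Separate warehouses per team: isolated compute, independently sized and billed
CREATE WAREHOUSE IF NOT EXISTS ingest_wh
  WAREHOUSE_SIZE = 'SMALL'
  AUTO_SUSPEND   = 60       -- suspend after 60 s idle; suspended warehouses cost nothing
  AUTO_RESUME    = TRUE;

CREATE WAREHOUSE IF NOT EXISTS finance_wh
  WAREHOUSE_SIZE = 'MEDIUM'
  AUTO_SUSPEND   = 300
  AUTO_RESUME    = TRUE;

-- Each session picks its own compute; finance reporting never contends with ingestion
USE WAREHOUSE finance_wh;
```

Because every warehouse reads the same underlying storage, teams get isolated compute without copying any data.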

Data Modeling

Turning data → information → knowledge

Modeling is the process of taking raw data – landed in the warehouse as-is by a tool like Fivetran – and incrementally cleaning, contextualizing, categorizing, and classifying it.  Modeling turns data into information and ultimately knowledge.  At Vivanti, we use a zonal discipline when we model data.


  • Data is brought into the warehouse with minimal modification, often as JSON (semi-structured) or CSV (structured) files.  We use Fivetran for this.
  • Landed data is tagged with data lineage markers and staged, away from the landing zone, allowing other load processes to resume.
  • Staged data is filtered so that only new data is brought into the pre-modeling phase.  This is often referred to as “change data capture”.
  • Textual data gets scrubbed, date formats normalized, and values homogenized.  Consistency is enforced at this stage, field by field.
  • Data from multiple systems is reconciled and normalized.  It is then aggregated and split back out to form the fact / dimension tables and views.

Some modeling is mechanical and completely disconnected from the nature of the data.  The staging and historicization zones are largely independent of the precise nature of the data.  Once we get to the cleaning zone, however, it takes a deep understanding of the underpinning business processes to model properly.

At Vivanti, we prefer to model every zone from staging onward using dbt, but it is Snowflake’s novel storage model that makes this possible for even the largest of data sets.
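A staging-zone model in dbt might look something like the following sketch.  The model name, source, and columns are hypothetical, though `_fivetran_synced` is the sync timestamp Fivetran typically adds to landed tables:

```sql
-- models/staging/stg_orders.sql
{{ config(materialized='incremental', unique_key='order_id') }}

select
    order_id,
    customer_id,
    lower(trim(status))   as status,      -- homogenize values
    to_date(ordered_at)   as order_date,  -- normalize date formats
    _fivetran_synced      as loaded_at    -- lineage marker from the loader
from {{ source('shop', 'orders') }}

{% if is_incremental() %}
  -- change data capture: only pull rows newer than what we've already modeled
  where _fivetran_synced > (select max(loaded_at) from {{ this }})
{% endif %}
```

Each `dbt run` materializes this as a table in Snowflake; downstream models in the cleaning and integration zones select from it in exactly the same way.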

Targeted Proof-of-Concept

Sometimes, you just need to show that the whole thing works as advertised.

The promises of the cloud data platform – especially when powered by dbt – are compelling.  But it can be difficult to see just how you get from where you are today to where you want to go.  With a Vivanti targeted Proof-of-Concept, we offer a time-boxed, scope- and cost-controlled engagement.  The focus?  Delivering an executive-ready business case for a larger, more comprehensive, and more impactful project.


If you’re unsure about how well Snowflake will transform your analytics and business intelligence, get in touch and we can discuss a targeted PoC to help reduce that doubt.

We Can Look After Snowflake For You

Free up your team to focus on driving innovation and uncovering new insight.

Managing a Snowflake instance can be a full-time job.  Between auditing role-based access control policies, verifying sizing and capacity for virtual warehouses, checking for contention between different users of the same warehouse, and helping to optimize clustering for query performance, your window for unearthing new insights has shrunk significantly.  More Snowflake instances mean more data and better insight, but they also bring more work and overhead.

We can take on the day-to-day operational duties of managing your Snowflake instances, freeing you up to focus on what matters: chasing down new data, driving new insight, and solving new business problems.

Our managed cloud data platform offering includes everything from deployment and configuration to monitoring and optimization.

We watch:

    • Data pipeline success / failure
      When data transformation jobs fail, we figure out why and either implement fixes ourselves, or advise on possible courses of action.
    • Cost and consumption
      Snowflake uses a pay-as-you-grow model.  This works out really well in production, especially if someone keeps an eye on budgets and forecasts usage.  We are that someone.
    • Ownership and object access changes
      Security is of paramount importance, and the cornerstone of a secure data platform is ownership and role-based access.  We monitor changes to these controls to guard against inadvertent or malicious governance breaches.
    • Warehouse performance metrics
      Are your virtual warehouses sized appropriately?  When should you scale up to a larger warehouse, or scale out to a clustered setup?  We track compute resources to give informed answers to these questions.

Using metrics collected around the clock, we can help optimize your analytical workloads, speeding up ingestion, transformation, and reporting efforts.

We’re cloud automation junkies

Automation is in our bones. We’ve automated everything – from code review tests to entire infrastructure deployments and platform scale-outs – managing thousands of machines and hundreds of thousands of applications.

Your Snowflake data cloud is in expert hands, under watchful eyes.