A new database has joined Cognetik Data Streams’ destinations library: MongoDB.
Roman writer Vitruvius recounted that Archimedes was enjoying a bath when he realized that, as he stepped into the tub, his body displaced a volume of water equal to the volume of the body submerged. The scientist is said to have jumped out of the bath and run naked through the streets of Syracuse in Sicily shouting: “Eureka, Eureka!” (“I have found it!”). While some doubt the authenticity of the story, our brand-new destination connector for MongoDB came out of a small Eureka moment too: while looking for a solution to a specific request, one of our engineers ended up building a MongoDB connector. And since we love giving back to the community, here it is!
The addition of MongoDB is a welcome win for our engineering and analyst teams, as complex data migration steps are a frequent obstacle to using MongoDB to its full potential.
We’re excited to see that, with the MongoDB Destination Connector in Data Streams, it takes less than 5 minutes to set up a migration stream. This works for migrating clickstream or advertising data, as well as moving BigQuery, Microsoft SQL Server, MySQL, PostgreSQL, Amazon Redshift, Snowflake, or Teradata warehouses to MongoDB.
Use cases for a MongoDB migration
We’ve experienced first-hand some cases in which industry practitioners needed an easy way to import data into MongoDB:
- Transforming or getting data out of relational databases (such as PostgreSQL, MySQL, or Microsoft SQL Server) and into the well-known NoSQL database MongoDB.
- Sending clickstream, social, and advertising data to a MongoDB database. MongoDB is known for its ability to handle structured, semi-structured, unstructured, and polymorphic data, unlike traditional relational databases, which handle only structured data.
- Migrating data from MySQL, BigQuery, Microsoft SQL Server, PostgreSQL, Amazon Redshift, Snowflake, or Teradata to MongoDB.
- Automating SQL to MongoDB data migration tasks.
- Needing a simpler way to work with the vast social media content that can be retrieved from the Facebook and Twitter APIs.
- Got a different use case yourself? We’d love to hear how our work helps you!
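To make the “polymorphic data” point above concrete, here is a minimal sketch (with hypothetical field names) of two differently shaped documents that MongoDB can store side by side in the same collection, something a fixed relational schema cannot do without schema changes:

```python
import json

# Two hypothetical social-media documents destined for the same collection:
# a tweet and a Facebook post share only a "source" field, yet both are
# valid MongoDB documents. Field names here are illustrative assumptions.
tweet = {
    "source": "twitter",
    "text": "Eureka!",
    "retweets": 42,
    "hashtags": ["archimedes", "data"],
}
fb_post = {
    "source": "facebook",
    "message": "We just launched a new connector.",
    "reactions": {"like": 10, "love": 3},  # nested sub-document
}

# Both serialize to JSON lines that a MongoDB driver or the mongoimport
# tool can ingest into one collection, no ALTER TABLE required.
for doc in (tweet, fb_post):
    print(json.dumps(doc))
```

A relational store would force these into separate tables or sparse nullable columns; MongoDB simply accepts both shapes.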
The main advantages of using Cognetik Data Streams’ MongoDB connector are:
- Integrate cloud or warehouse data into MongoDB with zero coding skills – seamlessly connect, pull, and send to MongoDB data from top marketing and clickstream sources like Google Analytics, Adobe Analytics, Google Ads, DoubleClick, Facebook, and Twitter, or data warehouses such as Snowflake, Teradata, PostgreSQL, Microsoft SQL Server, MySQL, and Amazon Redshift.
- Enterprise level security: Cognetik uses the same security protocols used by Amazon, Microsoft, and Google.
- A cost-effective way to migrate to MongoDB (from zero costs with the Individual Plan, to a customized offer with our Enterprise plan).
- The ability to export custom time periods of advertising or clickstream data as you migrate to MongoDB. Data Streams bypasses the limitation of some marketing platforms that offer only 28-day intervals instead of monthly reports. Specific industry verticals use this feature to create customized streams using a Gregorian, Retail, or custom calendar, a specific granularity, and different time zones.
- Automated & scheduled data migration tasks. With Data Streams you can set recurring streams and custom scheduling to automate data extracts: define when the import starts, when it ends, and how often it runs.
- Custom schema mapping
- Inspect data with preview: analysts often need to wait for their custom import scripts to finish before they can confirm everything worked correctly. Data Streams’ preview feature lets you verify results on the fly and confirm that your stream migration will have the desired outcome.
Common MongoDB applications
MongoDB launched as an open-source non-relational database meant to help developers spend more time enabling and enhancing their apps, and less time handling data schemas. Over time, it proved to offer faster time to market and a lower cost of ownership.
- It permits real-time analytics and aggregation.
- It increases development productivity.
- In some cases, it permits more efficient scaling with limited resources.
MongoDB often acts as a complement to relational databases – not as a drop-in replacement.
As MongoDB explains on its customers page, a non-relational database has a wide array of applications:
- Analytics, enterprise social networking, social marketing
- Centralized content management
- For example, Business Insider has been powered by MongoDB since its 2009 launch
- The Guardian apparently also uses MongoDB to push interactive features to its users
- For social or mobile networking platforms
- For example, Foursquare’s original application used a single relational database at the beginning, which made it hard to scale to the many nodes required by a high-traffic application. As the app gained traction, their engineers found that MongoDB could solve a lot of the needs they had at the time, according to MongoDB’s case study.
- Practitioners handling billing, online advertising and user data often end up needing to migrate data from Oracle’s platforms to MongoDB.
In e-commerce website development, its benefits include a faster time to market, more time spent on the front end and less on the back end (empowering both developers and users), faster read and write performance – and thus a superior user experience – as well as better scalability, since the database can grow to meet evolving business needs.
What did analysts use before the newly launched Cognetik’s Data Streams connector for MongoDB?
- While a number of more expensive connectors already exist – including Informatica, Pentaho, Talend, FME, and Studio 3T – many users would create their own scripts to transform source data into a hierarchical JSON structure that can be imported into MongoDB using the `mongoimport` tool. People needing more flexibility over the data format would usually build a Python-based script. One example is Lars Kumbier’s detailed 2017 solution to migrating PostgreSQL to MongoDB.
- Many organizations create feeds from their source systems, dumping daily updates from an existing relational database to MongoDB to run parallel operations or to perform app development and load testing.
- Shutterfly used an incremental approach to migrate the metadata of 6 billion images and 20 TB of data from Oracle to MongoDB. They ran the existing RDBMS in parallel with the new MongoDB deployment, incrementally transferring production data.
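The hand-rolled script approach described above can be sketched in a few lines: read rows from a relational source and emit newline-delimited JSON that `mongoimport` can load. This is a minimal illustration, not any particular published solution; the table, column names, and output file are hypothetical, and sqlite3 stands in for PostgreSQL or MySQL to keep the sketch self-contained.

```python
import json
import sqlite3

# Hypothetical relational source; in practice this would be a connection
# to PostgreSQL, MySQL, etc.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, customer TEXT, total REAL)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, "Ada", 19.99), (2, "Grace", 5.00)],
)

cur = conn.execute("SELECT id, customer, total FROM orders")
columns = [desc[0] for desc in cur.description]

# Each row becomes one JSON document, one per line (newline-delimited JSON),
# the format mongoimport accepts by default.
with open("orders.jsonl", "w") as out:
    for row in cur:
        out.write(json.dumps(dict(zip(columns, row))) + "\n")
```

The resulting file could then be loaded with something like `mongoimport --db shop --collection orders orders.jsonl`. Every schema change in the source means revisiting a script like this, which is exactly the maintenance burden a managed connector removes.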
To all these alternatives, we are excited to bring Data Streams as a cost-effective, 5-minute set-up solution!
Give it a try immediately with the Individual Data Streams Package.
Or read our full product release in our release notes.
We’d be greatly encouraged if you spread the word about our new connector and your experience with Data Streams on LinkedIn, Twitter, or Facebook.