Customer Review of Equalum: "Keeps the source and target synchronized at all times"

by Caroline Maier

March 16, 2021 6:17am


Sometimes our customers say it best. Check out what one of them had to say about Equalum's capabilities. Interested in seeing how Equalum can modernize and transform your Data Integration?


*Disclosure: IT Central Station contacted the reviewer to collect the review and to validate its authenticity. The reviewer was referred by the vendor, but the review is not subject to editing or approval by the vendor.*

What is our primary use case?

We use it for replication. We have SQL Server databases, and some of that data needs to go to Oracle for the application team, because their application is connected to Oracle databases while the back-end application is connected to SQL Server. We create workflows where SQL Server is the source, Oracle is the target, and all the tables in SQL Server replicate to Oracle. We have 59 flows for five databases, and these run in production, development, and staging, so multiply that by three. That is how many flows we have.

How has it helped my organization?

There are applications that have stopped supporting Oracle, so the entire application is being migrated to SQL Server. All of the application data now comes into SQL Server, but because other applications are still linked to the Oracle data, they still need Oracle. Initially, when we didn't have Equalum, we used to write Python scripts to pull data from SQL Server and put it in Oracle, but the Python scripts required a lot of maintenance and development. Also, if there was any problem, you needed development knowledge to go in and change the Python script.
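The heart of such a hand-rolled replication script is the diffing step: compare what is in the source with what is in the target and work out the inserts, updates, and deletes to apply. A minimal sketch of that logic (rows shown as plain dicts; a real script would fetch them via database drivers such as pyodbc and oracledb, which the reviewer does not name):

```python
def plan_sync(source_rows, target_rows, key="id"):
    """Given source and target rows (lists of dicts sharing a primary
    key column), return the inserts, updates, and deletes needed to
    make the target match the source."""
    src = {r[key]: r for r in source_rows}
    tgt = {r[key]: r for r in target_rows}
    inserts = [r for k, r in src.items() if k not in tgt]
    updates = [r for k, r in src.items() if k in tgt and tgt[k] != r]
    deletes = [k for k in tgt if k not in src]
    return inserts, updates, deletes
```

Every piece of this (fetching rows, applying the plan, error handling, scheduling) is exactly the kind of maintenance burden the review describes, which a managed replication tool takes off the table.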

Since getting Equalum, the data has been flowing very fast. I don't have any knowledge of Python scripting, but I can still create flows. The data streams very well and stays in sync. The notification system is good: if there is any problem in SQL Server or Oracle, Equalum notifies us, and we can check the problem on our end. If the problem is on their end, we have the ticketing system, which is very good. If it is a critical production issue, you open a ticket and they respond very quickly.

Initially, there was a big development team running the Python scripts. Now, we don't need to hire anyone extra. As a SQL Server DBA, I take care of it, and we also have a help desk team that takes care of it. With all the people who are using Equalum, it does not need any extra support, hires, or resources.

Overall, Equalum has resulted in a lot of system performance improvements in our organization. It has helped us out by keeping the source and target synchronized at all times.

What is most valuable?

It has good features. The replication feature is wonderful because the data streams live and we can change the polling rates. Initially, changes took 50 seconds to propagate from SQL Server to Oracle; now, whatever changes happen in SQL Server are reflected in Oracle within 30 seconds when pulled via Equalum.

Equalum is a good development tool and user-friendly as well. The front-end follows a nice, easy methodology; it hardly takes a day to teach someone, who can then create the workflow. Once the workflow is set, you don't have to do anything. The data constantly flows from SQL Server to Oracle, i.e., from the source to the target.

It has a strong command-line feature. With a purely front-end tool, like SSIS, I have to create each flow manually. In Equalum, however, we can write a command-line program and deploy 50 to 100 flows at once through the command line.
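The review does not show Equalum's actual commands, so the sketch below is purely illustrative: the CLI name (`equalum-cli`) and all flags are invented stand-ins. The point it demonstrates is the scripting pattern itself, generating one flow-creation command per table so that dozens of flows can be deployed in a single scripted pass instead of being clicked together one by one:

```python
def flow_commands(tables, source, target, cli="equalum-cli"):
    """Build one flow-creation command per source table.
    The CLI name and flag names here are hypothetical placeholders;
    only the bulk-scripting pattern is the point."""
    return [
        f"{cli} create-flow --source {source} --target {target} --table {table}"
        for table in tables
    ]

# A real deployment script would feed each command to subprocess.run().
commands = flow_commands(["dbo.orders", "dbo.invoices"], "mssql_prod", "oracle_prod")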

Equalum provides a single platform for the following core architectural use cases: CDC replication, streaming ETL, and batch ETL. CDC is important for me as a SQL Server DBA. Without CDC, all the data would have to be pulled directly from my tables, which are already serving the application, so there would be a performance hit. Because there is CDC, the captured changes go into the CDC table and Equalum pulls from that CDC table. Therefore, there is no user impact on my DB servers.
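To make the mechanism concrete, here is a minimal sketch of how a consumer like Equalum might apply rows read from a SQL Server CDC change table to a target. The operation codes match SQL Server's documented `__$operation` column (1 = delete, 2 = insert, 3 = update before-image, 4 = update after-image); the in-memory dict is only a stand-in for the real Oracle target, and the polling/connection machinery is omitted:

```python
# SQL Server CDC __$operation codes (documented values).
CDC_DELETE, CDC_INSERT, CDC_UPDATE_BEFORE, CDC_UPDATE_AFTER = 1, 2, 3, 4

def apply_changes(target, changes, key="id"):
    """Apply CDC change rows to an in-memory 'target table' (a dict
    keyed by primary key). Each change is a (operation, row) pair."""
    for op, row in changes:
        if op == CDC_DELETE:
            target.pop(row[key], None)
        elif op in (CDC_INSERT, CDC_UPDATE_AFTER):
            target[row[key]] = row
        # CDC_UPDATE_BEFORE rows carry the pre-image; nothing to apply.
    return target
```

Because only the captured changes are read, the application-facing tables are never scanned, which is exactly the "no user impact" point made above.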

They also support something called binary logs for Oracle. If you have these logs in place, then you can pull the data through the logs. That is convenient because you can pull in big data through batch processing, which I have not personally used myself, though I have seen people in my organization using batches because they can be scheduled. While my data is live streaming and keeps streaming every three minutes, some data doesn't require live streaming. For that data, they pull from source to target every morning using batch processing, which is good.

It is important to me that the solution provides a no-code UI, with Kafka and Spark fully managed in the platform engine, because then I don't have to take care of anything. There are no backup problems: for the flows that I create, I don't have to back up, restore, or maintain them. I just need to create the workflow from my end, with a user in the source and a user in the target from the database perspective. The front-end, from Equalum through to Kafka, is taken care of by the platform, which makes it very user-friendly.

When we take the data from source to target, we can add fields, like a timestamp. Data accuracy is 100 percent: whatever data you have in the source is exactly the data reflected in the target. In the many months that I have been using it for all my projects, I haven't found any data discrepancies. There has not been a time when the source data differed from the target data, which is very good.

What needs improvement?

Right now, they have a good notification system, but it is in bulk. For example, if I have five projects running and I set up a notification, the notification comes back to me for all five projects. I would like the notification to come back only for one project. They are working on this improvement because we told them about it. These are the small changes that we keep asking them for, and they do them for us. If you want features or modifications, they help us with that. So, the team is on it at all times.

For how long have I used the solution?

I have been using it for six months. The company has been using the solution for six to seven years.

What do I think about the stability of the solution?

It is robust. The stability is good. Long-term, it is a nice, strong tool.

What do I think about the scalability of the solution?

We have multiple nodes. For failover, the data fails over to another node and is then distributed. Initially, when we started with Equalum, it was only one project with 59 flows. Now, we have 400 to 500 flows. It is easily scalable; we didn't have to do much on our side for scalability purposes.

If my number of flows was 50 initially but I am now running 500, it is because we are bringing more applications into SQL Server as Oracle support is discontinued. The more data that comes into SQL Server, the more streaming we have to do with Equalum. We are talking about huge scalability. For us as users, there isn't much to do; instead of seeing 50 flows on the screen, I see 500. Behind the scenes, however, I think Equalum has to give us more resources.

How are customer service and technical support?

If there is anything that we want to change, we go to the Equalum team. The support is wonderful. They came back to us, giving us a demo on how to use it. They were very nice in that way. They respond very quickly. Their support is very good.

They keep giving us more training on how to use Equalum. The Equalum team comes in and tells us about new features. We have a meeting where they talk with us every week.

When I used to stream a flow from source to target and something changed or stopped working, I would bring the entire source to the target as brand new. This is called re-streaming. Re-streaming used to take a lot of time, but they have done new upgrades in which re-streaming is very fast. Also, previously the re-streaming feature wasn't available on the front-end; wherever re-streaming had to be done, it had to be done from the command line. Now, they have brought re-streaming to the front-end. These are two very good features that they have added for us recently.

Which solution did I use previously and why did I switch?

We used Python scripts previously. Heavy development was needed on the Python side, and it requires a developer experienced in writing Python scripts; they must have that understanding. Maintenance also needs to be done by a developer. Because Equalum is a UI tool, you can do so many things with it. It is a good tool to use. It's a tool versus a script, and obviously you will prefer the tool.

I like the overall ease of use of the solution's user interface very much. I was a heavy user of SSIS before, which was the only ETL tool I had used for data warehousing. When I came to this company six months back, I was introduced to Equalum. I find Equalum very good because it supports multiple sources and targets. There are quite a few very good options, like SQL Server to Oracle. As long as the source and the target have Java Database Connectivity (JDBC), they can be replicated. The tool is very simple to use. The command line takes time to understand, but once you understand it, it is easy going. The front-end is very user-friendly, so there aren't any issues.

How was the initial setup?

The initial setup was straightforward. There is nothing complex. Obviously, there were commands that I didn't know how to write at first, and they helped me understand them. Once you understand the commands, both the command line and the front-end are straightforward. There are no hidden complexities.

They have good documentation. Yesterday, I asked the Equalum team about something, and they sent me the documentation for it. The documentation is well detailed, and they have videos supporting it. If a new feature is coming out, or there is any new training you want to do, they have videos in place. The videos are very good: you can review the code, follow the video, and do your work.

If it is SQL Server, then as a DBA I have to enable CDC and make sure there is a user with the proper privileges. If I have Oracle, I need a user there with the proper privileges, based on what they have given us in the documentation. Once all of this is ready on my end, it is a straightforward deployment.

Deployment does not take much time. A brand new deployment takes half an hour to an hour at most. Bringing all the tables from source to target for the first time takes some time, around five hours maximum, for all the data to stream from source to target. Once it is streamed, it is very quick. With very few tables, I have seen a deployment finish in half an hour.

What was our ROI?

The main impact concerns Oracle LogMiner performance. Synchronization time is drastically reduced if you use the solution's Oracle Binary Log Parser. With 60 million records, synchronization between the source and target tables initially used to take a minute; now, it takes a second.

If we were not using Equalum, we would need to use Python scripts, C#, etc., which need heavy development and more time. The timing is okay, because you only need to write a script once and then you can use it, but the maintenance is very difficult: if you don't have someone with knowledge of Python and C#, you cannot go in and modify the scripts. Whereas with Equalum, we work with the Equalum support team, and our Flex team also takes care of Equalum. If there is an issue, or if they want a flow created, they do it themselves. We don't even need any scripting or programming knowledge.

Equalum has improved the speed of data delivery by more than 50 percent. Python scripts used to take time to run; then you had to schedule them and take care of the scheduler. Sometimes, for some reason, the scheduler did not work, and your job failed. This solution does not have that issue. It can do live streaming if you want, or, if you want batch processing, you can schedule batches and it runs.

Which other solutions did I evaluate?

Our team did PoCs and selected Equalum.

What other advice do I have?

We don't use it much for its transformation capabilities; we didn't initially know about that part of it. For example, if I have a numeric column in the source and I want to round up the figures, or do a string transformation such as find-and-replace, I can do that directly with the transformation operators. We used it only for replication before. Now, we are using it for transformation as well.
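The kinds of in-flight transforms described above (rounding numeric columns up, find-and-replace on string columns) can be sketched as a row-level function. This is not Equalum's API, just an illustration of what such transformation operators do to each row as it passes from source to target:

```python
import math

def make_transform(round_up_cols=(), replace_cols=None):
    """Build a row transform that rounds the named numeric columns up
    and applies find-and-replace to the named string columns,
    mimicking the operators described in the review."""
    replace_cols = replace_cols or {}

    def transform(row):
        out = dict(row)  # leave the source row untouched
        for col in round_up_cols:
            out[col] = math.ceil(out[col])
        for col, (find, repl) in replace_cols.items():
            out[col] = out[col].replace(find, repl)
        return out

    return transform
```

Example use: `make_transform(round_up_cols=("amount",), replace_cols={"name": ("-", "_")})` returns a function that turns `{"amount": 3.2, "name": "foo-bar"}` into `{"amount": 4, "name": "foo_bar"}`.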

If you want strong replication between any source and target with JDBC, go for Equalum. It's simple, easy to use, and requires less maintenance. The tool takes care of all your requirements, so you don't need to do daily backup and restore tasks. It is a straightforward tool. If you're doing ETL, try Equalum. It is the best bet.

I would rate the solution as 10 out of 10. I have no issues so far.
