I'm working with a team that is running 10 SQL Anywhere databases.

They are running these on Ubuntu 18.04.

SQL Anywhere is version 17.0.11.7312.

They want to introduce replication so they can use it to move data to their Azure environment. Long story short, one of the databases is 300 GB in size.

Currently they have been extracting for almost 24 hours and it looks like they are about 80% done.

Their machine has 8 cores, and for the most part it has just been using 1 core, with the occasional blip up to 2 cores. It never goes beyond that.

Is there any way to speed this process up? They are more or less throwing up their hands, thinking that an extraction taking this long will cause issues since it is running at isolation level 3.

This is not a disk space issue from what I can tell. They have about 100 TB available on the database partition.

Any thoughts or suggestions on what might speed this process up would be greatly appreciated.

Jeff Gibson, Nashville, TN

Additional info: we are watching the core usage, and SQL Anywhere just jumped up to all 8 cores, but it's only using about 12% of each core. It literally just started doing this.

asked 07 Dec '23, 12:30 by Jeff Gibson

edited 08 Dec '23, 03:12 by Volker Barth

When you say "they are wanting to introduce replication" am I correct in assuming that you mean SQL Remote?
When you say "extracting" do you mean running dbxtract against the database?

I'm going to assume yes and answer the question, and will delete the answer if my assumptions are wrong.

(07 Dec '23, 12:49) Reg Domaratzki

Does the publication contain complicated article definitions, say with several joins based on SUBSCRIBE BY values? That can slow down extraction heavily.

(07 Dec '23, 12:55) Volker Barth

My apologies, Reg. Yes, they are using SQL Remote, and this is one-way replication only: they do sends only from production and receives only at the two other environments.

(07 Dec '23, 13:55) Jeff Gibson

It is not. The publication is basically a select-all that says to send everything that happens.

(07 Dec '23, 13:56) Jeff Gibson
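
To make the distinction concrete, a plain "send everything" publication versus the kind of article Volker is asking about might look roughly like the following sketch; table, column, and publication names are hypothetical:

    -- A plain "send everything" publication: no SUBSCRIBE BY at all,
    -- so every remote receives every row (hypothetical table names):
    CREATE PUBLICATION pub_all (
        TABLE orders,
        TABLE order_items
    );

    -- A "complicated" article of the kind Volker describes: rows are
    -- partitioned per remote with SUBSCRIBE BY, and the child table
    -- derives its value through a subquery joining back to the parent,
    -- which dbxtract has to evaluate for every row during extraction:
    CREATE PUBLICATION pub_by_region (
        TABLE orders SUBSCRIBE BY region_code,
        TABLE order_items SUBSCRIBE BY (
            SELECT region_code
              FROM orders
             WHERE orders.order_id = order_items.order_id )
    );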

I would not suggest you run dbxtract on an in-use production consolidated database; instead, run the extract on a copy of the consolidated database.

  1. Stop the consolidated database.
  2. Make a copy of the database file to another location, preferably another computer just to be paranoid.
  3. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  4. While the consolidated is still not running, start the DB using dbeng17, and execute the SQL that creates the new remote user, subscribes them to the publications, and then starts the subscriptions (see the SQL sketch after this list). Make sure that the message directory for this new user exists.
  5. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  6. Start the consolidated database again. As the database is being used and data is being added, when you run dbremote against the production version of the consolidated (NOT THE BACKUP!!), it will create messages for the new remote user to eventually apply.
  7. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  8. Extract the remote user from the backup copy of the consolidated database (see the command-line sketch after this list). Nobody else is using this database, so the resources it is consuming or the time it takes is irrelevant.
  9. BE CAREFUL NEVER TO RUN DBREMOTE AGAINST THE COPY OF THE CONSOLIDATED.
  10. You can now delete the copy of the consolidated database, which will ensure that dbremote cannot be run against this database, because, you know, it no longer exists.
  11. Deploy the newly extracted remote database and run dbremote against the remote database. When dbremote runs, it will pick up messages from the consolidated database that include all the changes that have occurred on the consolidated since the copy of the consolidated database was taken in step 2.
  12. You now have two databases that are perfectly in sync, and the only downtime was the time to physically copy the database file of the consolidated database.
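
For step 4, the SQL might look roughly like the sketch below; the user name, password, message address, and publication name are placeholders, and a FILE message type is assumed:

    -- Placeholder names throughout; run while the database is started
    -- with dbeng17. The FILE address is a directory name that must
    -- already exist (relative to the SQLREMOTE directory, if set).
    CREATE USER azure_remote IDENTIFIED BY 'placeholder-password';
    GRANT REMOTE TO azure_remote TYPE FILE ADDRESS 'azure_remote';

    -- Subscribe the new remote user to the existing publication and
    -- start the subscription so operations are queued for it:
    CREATE SUBSCRIPTION TO pub_all FOR azure_remote;
    START SUBSCRIPTION TO pub_all FOR azure_remote;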
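And for steps 8, 6, and 11, the command lines might look roughly like this; connection strings, paths, and the remote user name are placeholders, so check dbxtract -? and dbremote -? for the exact options in your build:

    # Step 8: extract the new remote user from the backup copy of the
    # consolidated; nobody else is connected, so let it use the machine.
    dbxtract -v -c "UID=DBA;PWD=placeholder;DBF=/backup/cons_copy.db" /backup/extract azure_remote

    # Step 6: on the live production consolidated, dbremote generates
    # the messages that have accumulated for the new remote user.
    dbremote -c "UID=DBA;PWD=placeholder;ENG=cons_prod"

    # Step 11: on the deployed remote database, dbremote receives and
    # applies those messages.
    dbremote -c "UID=DBA;PWD=placeholder;DBF=/azure/remote.db"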

Reg


answered 07 Dec '23, 13:24 by Reg Domaratzki
