
This is a request for pointers to the best target for my research and testing. Here is the situation:

We provide software to our clients and part of that package is a stand-alone SQL Anywhere server on their hardware on their network. The client application is currently installed on each of their computers and it communicates with their server.

One of the ideas we have been considering is providing some standard data to them and possibly even keeping it up to date as it changes. Obviously we would want this added maintenance to be as easy and reliable for us as possible, not having to touch each client database as the data changes. Currently the discussion centers around one or two tables that would be standardized across all of our clients.

I have experimented a bit with pushing the data out to text (usually CSV), checking the destination columns, and then importing from the text file, but this seems unnecessarily complicated. Ideally (I think) we would keep a database on our network, update it there, and have it send the data out to tables within the clients' existing databases. The requirement is not for this to be instantaneous; even an overnight lag would be acceptable. I have been reading about SQL Remote as a way to "publish" data to a remote database, but most of the discussion seems focused on publishing to mobile databases rather than to another standalone database. Also, SQL Remote seems to initially require an extract from the parent database, rather than syncing a table between two existing databases.

There are some tools referred to as Replication, but that seems to be more of a backup or failover strategy.

Is there another product or tool I should be evaluating? Is SQL Remote the answer, but with some configuration to avoid running two databases at each client site (one holding their data and one that syncs with ours)? I just need some pointing in the right direction, I think.

asked 09 Aug '11, 11:17


Siger Matt
accept rate: 13%

edited 09 Aug '11, 14:32

Though we have been using SQL Remote for years, I would not recommend this here. IMHO, it would tie the standalone databases too strongly to your sample database. (This could be handled, of course, but it would seem a very uncommon SQL Remote scenario). The same would apply to MobiLink, methinks.

Instead, what about using a web client procedure in the clients' databases? That procedure could connect to your sample database (running as a publicly available database with the built-in web server, or with whatever web interface you choose) and extract new data as it becomes available. And it could be run on a regular schedule to check for new content...
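A minimal sketch of that idea, for what it's worth; the URL, server port, and all object names here are assumptions, not real endpoints:

```sql
-- On each client database: a web client function that calls the
-- built-in HTTP server of the publicly reachable sample database.
CREATE FUNCTION FetchStandardData()
RETURNS LONG VARCHAR
URL 'http://sample-db.example.com:8082/StandardData'
TYPE 'HTTP:GET';

-- Check for new content on a regular schedule.
CREATE EVENT CheckStandardData
SCHEDULE START TIME '02:00' EVERY 24 HOURS
HANDLER
BEGIN
    DECLARE body LONG VARCHAR;
    SET body = FetchStandardData();
    -- ...parse body and update the local table(s) here...
END;
```

If the connection fails, the event simply tries again on the next scheduled run, which fits the "overnight lag is acceptable" requirement.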


answered 09 Aug '11, 17:19


Volker Barth
accept rate: 32%

I would assume from your description that

  1. the data load is not too high and
  2. the data modification rate is quite low and
  3. there are no very complex relations (i.e. no multi-level FKs and the like).

Given that most of these assumptions are correct, I would expect that a homebrew "update mechanism" would not be much worse than the sophisticated data synchronization used with SQL Remote and MobiLink.

E.g. it would surely be possible to use this approach to update one or two tables with data - or, if that is allowed, to delete and then reload all the "sample data". - I would definitely not use a homebrew approach when having to update single values from different sites in different tables of a complex data model...

(09 Aug '11, 17:28) Volker Barth

Your assumptions are correct. I will see what I can accomplish with the web client procedure and see what questions pop up from testing that.

(10 Aug '11, 10:35) Siger Matt

Good ISV style question!! :)

All the applications we have worked on are commercial style applications for vertical markets where the database resides on the server at said office, and the application is installed on the workstation.

We've also looked at options on how to keep our system tables updated with the latest information. These could be simple rate descriptions, tax rates, surcharge descriptions, etc.

I would say that SQL Remote would probably not be the option you would want to use. That's not to say you couldn't. But you would be looking at each database that you sell being a replicating database to your master/consolidated database. You could set up the subscriptions for the tables that you need, but you would be dealing with making sure all your replication information was delivered correctly, and I could see that becoming a HUGE pain in the rear end.

MobiLink, on the other hand, could be another option to look at. You can just set up scripts that send data from the consolidated database to the remotes. That way you're not worried about needing to replicate information from the remotes back to the consolidated. I've personally just studied this, but it seems to be the option when you want to transfer data from a master database to a database that is outside your walls. You could look at possibly using direct row handling.
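For the download-only case, that can boil down to a single download_cursor script registered on the consolidated database; a sketch with hypothetical version, table, and column names (the `?` parameter is the last-download timestamp MobiLink supplies):

```sql
-- On the consolidated (master) database: register a download_cursor
-- for the shared table under a script version.
CALL ml_add_table_script( 'std_data_v1', 'PhoneBook', 'download_cursor',
    'SELECT id, office_name, phone
       FROM PhoneBook
      WHERE last_modified >= ?' );
```

With no upload scripts defined, the remotes only ever pull data down, which matches the "standard data pushed one way" goal.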

Another way might be the simplest but crudest. You could just download a SQL script to a particular directory on the server. When the server starts (or some other event occurs) you could fire off an event that runs that script from ISQL. You can even run ISQL in silent mode so the user doesn't see it. You could set up some constraints around the table that would keep you from entering data more than once. Heck, you could do all sorts of things.
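A hedged sketch of that crude-but-simple route; the path, server name, and credentials are placeholders:

```sql
-- On the client database: when the server starts, run any downloaded
-- update script through Interactive SQL without showing a window.
CREATE EVENT ApplyPendingUpdates
TYPE DatabaseStart
HANDLER
BEGIN
    CALL xp_cmdshell(
        'dbisql -nogui -c "UID=dba;PWD=secret;Server=ClientSrv" c:\updates\pending.sql',
        'no_output' );
END;
```

Inside pending.sql, SQL Anywhere's `INSERT ... ON EXISTING SKIP` is one way to get the "don't enter data more than once" behavior without extra constraints.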

Feel free to contact me back if you have any additional questions!! Would love to hear what direction you go since we're doing this as well.


Jeff Gibson
Intercept Solutions - Sybase SQL Anywhere OEM Partner
Nashville, TN


answered 09 Aug '11, 19:10


Jeff Gibson
accept rate: 22%

edited 10 Aug '11, 01:17

Another alternative would be to use a proxy table or remote data access:

That is, your client databases would contain a proxy table that is linked to the corresponding table in your public sample database. In general, accessing remote data is quite slow (particularly over a WAN), but if there's just a need to access this data now and then, this might be okay. By design, the data would be up to date, as there's no replication of data here: the clients would directly access your data.

A different approach would be to use a remote stored procedure to copy your sample data and store it locally. That would be somewhat similar to the web client function approach.
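A sketch of the proxy-table setup; the server name, ODBC data source, and table names are all assumptions:

```sql
-- One-time setup on each client database: define the remote server
-- and a proxy table mapped to the table in the public sample database.
CREATE SERVER SampleSrv CLASS 'SAODBC'
USING 'SampleDSN';  -- an ODBC data source pointing at the sample database

CREATE EXISTING TABLE dba.PhoneBook_proxy
AT 'SampleSrv..dba.PhoneBook';

-- Direct access: always current, but at WAN speed.
SELECT * FROM dba.PhoneBook_proxy;

-- Or copy the data locally now and then, much like the remote
-- stored procedure idea (requires a primary key on the local table):
INSERT INTO dba.PhoneBook
ON EXISTING UPDATE
SELECT * FROM dba.PhoneBook_proxy;
```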


answered 09 Aug '11, 18:01


Volker Barth
accept rate: 32%

This is not actually an answer but a conceptual question:

AFAIK all SQL Anywhere's synchronization facilities (SQL Remote, MobiLink and Replication Server) do enforce some kind of tight coupling between a central database and a remote one. Basically, the central database has to keep track of the existence and synchronization state of all its remotes.

For two-way synchronization, that's absolutely necessary IMHO - you wouldn't want to synchronize with an unknown counterpart. - Obviously, that implies some kind of maintenance on the central database and the possibility that both sites get out of sync and have to be re-synchronized.

For other situations, it's often enough for one database (say db2) to be able to extract some data from another one (db1). If db2 can't connect to db1 for some reason, well, then that's temporarily acceptable, and the imported data will stay out of date for a while. Usually, db1 does not even have to know that there are other databases that use its data. (And if so, it should not know...) - Apparently, this is a much looser coupling.

So I would like to ask you whether your setup would tend more to the first or the second approach - i.e. how much does your "sample database" have to know about your customer databases?


answered 10 Aug '11, 17:29


Volker Barth
accept rate: 32%

The quick answer is that the sample database, or parent, does not need to know about the customer databases at all. It could just be treated as the authoritative store that the customer databases pull 1-2 tables from.

It is a bit difficult to answer fully because the need is a new one that we are in the process of defining. The best approach to take is going to be largely dependent on which processes are determined to be most effective and efficient.

The main driver of the project is a situation similar to what Jeff describes above. There are 1-2 tables of information that would be the same for each install, such as a phonebook of standard government offices with their addresses and phone numbers. Users at each install could add to these, but those changes would not enter the standard list, nor would a user at install A care about custom entries at install B. The tricky part is that now that everything is installed, the government issues a new phonebook, and will do so each year. If we are providing that information initially, it would be valuable to keep it up to date, so how would we do that across all of the customers without having to log in and update each site?

Much later, for a different goal and project we might look at how customer A could give a chunk of related data to customer B who would manipulate and update it and then pass it back to customer A. Unfortunately this data would not be standardized so there would need to be many checks back and forth to make sure the right data matched up to the right relationships on each side.

Before either can be accomplished though, I needed to have some experience with how two independent databases could push data back and forth over the WAN as the only goal, in order to even establish what our options might be.

(11 Aug '11, 09:38) Siger Matt

> Much later, for a different goal and project we might look at how customer A could give a chunk of related data to customer B who would manipulate and update it and then pass it back to customer A. Unfortunately this data would not be standardized so there would need to be many checks back and forth to make sure the right data matched up to the right relationships on each side.

If this is a possible 'stretch goal' down the road, you will definitely want to check out MobiLink synchronization as a possible way to implement this solution.

Having a 'one-way source' for data is 'simple' to code to without involving MobiLink technology (e.g. Volker's excellent suggestions above). However, once you want to start passing information back and forth between sites, you will want the advanced features of MobiLink synchronization that are built into the product. Implementing MobiLink now to do the 'simple push' may give you more options down the road for your development.


(11 Aug '11, 15:18) Jeff Albion

I fully agree with Jeff's MobiLink suggestion if the "much later goal" is seen as a realistic option.

(11 Aug '11, 17:05) Volker Barth

After much testing and fiddling I did get SQL Remote to exchange data between databases. The noted difficulty is that this would require every customer database to be a spawn of a master database we maintain, or running two databases at the customer site.

At Volker's suggestion above, I started fiddling with the web server and client procedures built into the database and was able to come up with something much simpler: the master database makes the table available as a SELECT defined in a web service. The client database can use a web client procedure inside an openxml() call to read the output page as rows and columns, and is then free to insert, update, manipulate, etc. in the client database.

I had the best results using SOAP as the output type on the web server database rather than XML. For some reason, XML with openxml() did not come in nearly as cleanly as SOAP with openxml().

Maybe that's why it's called SOAP?
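For what it's worth, here is roughly what that combination looks like; the service, function, element, and column names are all assumptions, and the openxml() XPath depends on the shape of the actual SOAP response:

```sql
-- On the master database: publish the table as a SOAP service.
CREATE SERVICE StandardPhoneBook
TYPE 'SOAP'
AUTHORIZATION OFF
USER dba
AS SELECT id, office_name, phone FROM dba.PhoneBook;

-- On each client database: a web client function fetches the response...
CREATE FUNCTION GetPhoneBook()
RETURNS XML
URL 'http://master.example.com:8082/StandardPhoneBook'
TYPE 'SOAP:DOC';

-- ...and openxml() shreds it into rows for insert/update.
INSERT INTO dba.PhoneBook ( id, office_name, phone )
ON EXISTING UPDATE
SELECT id, office_name, phone
FROM openxml( GetPhoneBook(), '//row' )
    WITH ( id           INT          'id',
           office_name  VARCHAR(100) 'office_name',
           phone        VARCHAR(30)  'phone' );
```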

MobiLink will be the next target of testing.


answered 24 Aug '11, 15:30


Siger Matt
accept rate: 13%

Another "simple" way to enable "standard data updated by customers" might be to allow them to use a web client procedure to update/add data (or to supply "data modification suggestions" that your company would still have to commit...). Just in case you still expect only data from a few tables to be "standardized". The web client approach would avoid all that synchronization stuff, i.e. modifications could be done just-in-time.

Don't get me wrong: synchronization is a great tool - if one really does need it :)

(25 Aug '11, 09:21) Volker Barth

Here's another question for you, Siger Matt.

How often do you envision changes to the "master" database?

I have a suggestion for you to consider. Why not have the clients download the latest updates to their database? Many applications give users the responsibility to "Search for Updates."

So you would have a button, let's say, on some window, the About window perhaps, titled "Check for Updates".

Or you could send the clients an email with an HTTP link in the message which would download an installation program, making the changes to the database.

It all depends on how much and how often you need to update the database, trying to anticipate future needs as well.



answered 10 Sep '11, 19:36

MetaHertz

accept rate: 0%
