We have an online, interactive fleet management website that we want to turn into an offline application accessible via WLAN/Wi-Fi. The online website will keep running and will remain hosted on the main central server.

For the local offline version: each location will have the application installed on one local desktop, which will act as the local server and can be accessed by various users from laptops, desktops, or even tablets.

Communication to/from the main central server: we would like the ability to periodically download updated master data and code updates from the central server to the local server. In the other direction, we also need the ability to move transaction data from the local desktop/server up to the central server to keep everything in sync.

How can we build this client-server version of the website for local use without internet, using only WLAN? Is it possible? What constraints and technologies would be needed? Thanks in advance!
This sounds like it could be a pretty big project, so you might not find all the answers here in a few posts. But I'll throw out a few ideas and things to think about:

1. Your local "offline" versions of your site could most likely run on the same kind of servers as your online site, but simply on an intranet/LAN (over Wi-Fi or whatever) instead of over the internet. This is a very common practice. Lots of companies use normal LAMP (Linux, Apache, MySQL, PHP) servers to run local/intranet apps. They can even be run from low-end desktop computers connected to a local network.

2. Using version control can help keep your servers in sync. Code changes could be pulled down from a central repository or pushed up as needed. Check out something like bitbucket.org (you can have private repos there). This is a good thing to use anyway.

3. Keeping a database in sync can be extremely tricky. It really depends on the situation and requires some thought to figure out what's best in your case. Some things to think about: are the "offline" versions simply adding additional data, or are they changing historical data? If a server makes a change, what exactly should sync to the other servers? What if two servers change the same set of data? More info is really needed to give you a good answer on how to accomplish this part of your project. If your needs are simple/small, something like SQLite might be worth looking into; you would store it as a local file, commit it into your code repository, and could easily sync it to all your servers (there's a rough sketch of this idea below).

I hope that helps a little.
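To make points 1 and 3 a bit more concrete, here's a minimal sketch of how the same PHP code could talk to a local SQLite file on a small intranet box, or to a MySQL database on a full LAMP server. The file path, DSN, credentials, table, and column names are all made-up placeholders, not anything from your actual site:

```php
<?php
// Minimal sketch only - the file path, DSN, credentials, table and column
// names ("transactions", "vehicle_id", ...) are placeholders for illustration.

// Local server: a single SQLite file is often plenty for a small intranet app.
$db = new PDO('sqlite:/var/www/fleet/fleet.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Central server would just use a different PDO driver/DSN, e.g.:
// $db = new PDO('mysql:host=central.example.com;dbname=fleet', 'user', 'pass');
// (the CREATE TABLE below is SQLite-flavoured; MySQL's syntax differs slightly)

$db->exec('CREATE TABLE IF NOT EXISTS transactions (
    id INTEGER PRIMARY KEY AUTOINCREMENT,
    vehicle_id INTEGER,
    amount REAL,
    created_at TEXT
)');

// The application code itself is the same no matter which server it runs on.
$stmt = $db->prepare('INSERT INTO transactions (vehicle_id, amount, created_at)
                      VALUES (?, ?, ?)');
$stmt->execute([42, 150.00, date('Y-m-d H:i:s')]);
?>
```

Because PDO hides most of the differences between the two drivers, you could start with SQLite on the local desktops and move to MySQL later without rewriting much application code.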
Thanks for the reply; you are right in your assessment and recommendations.

For syncing the code and master data, we will refresh the local machine from the central server very seldom, maybe only when we have some new functionality. Otherwise we don't see a reason to send a feed from the central server to the local machine. The transfer of info between the central and local servers will mostly be one-way, i.e. from the local machine to the central machine, to send over the transaction data.

Now I would like to understand the best way to move the transaction data from the local server to the central server so that it can also be accessed outside the WLAN area. In particular, I would like to know how I can send only the new lines in the transaction history and not any duplicates. What's the best approach?

1. Download the delta transactions into a plain text file and upload it as a batch process (a rough sketch of this idea is below).
2. Let the program read the tables and find the new records, upload them to the central server, and mark them as successfully transferred.

Or perhaps you gurus can suggest a better method. Thanks again,
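For option 1, this is roughly the kind of export script I have in mind on the local server; the table name and the "exported" flag column are only placeholders for illustration, not our real schema:

```php
<?php
// Export-to-CSV sketch for option 1. The "transactions" table and the
// "exported" flag column are placeholders, not the real schema.
$db = new PDO('sqlite:/var/www/fleet/fleet.sqlite');
$db->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

// Grab only the rows that have never been exported before.
$rows = $db->query('SELECT id, vehicle_id, amount, created_at
                    FROM transactions
                    WHERE exported = 0')->fetchAll(PDO::FETCH_ASSOC);

$file = '/var/www/fleet/export/delta_' . date('Ymd_His') . '.csv';
$out  = fopen($file, 'w');

foreach ($rows as $row) {
    fputcsv($out, $row);
    // Flag the row so the next export skips it and no duplicates are sent.
    $db->prepare('UPDATE transactions SET exported = 1 WHERE id = ?')
       ->execute([$row['id']]);
}
fclose($out);
// The CSV can then be pushed to the central server and loaded there as a
// batch job (for example with MySQL's LOAD DATA INFILE).
?>
```

One thing I'm unsure about: the flag only says the file was written, not that the central server actually received it, so the batch upload step would still need its own confirmation.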
I'd make an app using buzztouch - that way it works offline and syncs up again when it gets connected. There would still be a lot of development work needed, but it would be a more robust solution.
I think you have the right idea. There are a lot of approaches to handle something like this, some more complicated (and more robust) than others. In your case it sounds like you could take a relatively simple approach.

1. The central and local servers could all run their own MySQL or SQLite databases internally.

2. Every night (or whenever you want) you could have a script that runs (either on the local servers or on the central one) and inserts any "new" rows into the central server. Keep in mind that in this scenario, all your local servers would be out of sync with each other (they'd all have different "new" rows throughout the day).

3. Either way, you'd need to keep track of what is considered "new" on the local servers. This would most likely involve looking at the timestamps of existing rows. Keep in mind that date/time synchronization between servers becomes very important with this method, so that you know what is new and what isn't in a simple way. Alternatively, you could keep track, on a server-by-server basis, of the last synced row's ID and start from there (there's a rough sketch of this below).

4. After all the new local-server rows have been inserted on the central server successfully, you could have all the local servers do a full sync with the central server and pull down the entire database. This would keep everything synced up and ready for the next day's transactions.

Personally I think this wouldn't be a bad way to go, especially if you don't want a ton of work or complicated development. However, it definitely has its downfalls. Just keep in mind that if your business grows or if you have a real budget for this project, you might want to consider something more robust. Someone might have a better, simpler solution; I'm just throwing ideas out there. Good luck with the project.
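Here's a minimal sketch of the "last synced row ID" idea from step 3, written as a nightly script. The connection details, the sync_state bookmark table, and the transactions column names are all assumptions for illustration, not something your site necessarily has:

```php
<?php
// Nightly push of new local rows to the central server, using a per-server
// "last synced ID" bookmark. All names and credentials here are placeholders.
$local   = new PDO('sqlite:/var/www/fleet/fleet.sqlite');
$central = new PDO('mysql:host=central.example.com;dbname=fleet', 'sync_user', 'secret');
$local->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);
$central->setAttribute(PDO::ATTR_ERRMODE, PDO::ERRMODE_EXCEPTION);

$serverId = 'warehouse-03';  // each local server gets its own identifier

// The central DB keeps one bookmark row per local server: (server_id, last_id),
// with server_id as the primary key so REPLACE can update it in place.
$stmt = $central->prepare('SELECT last_id FROM sync_state WHERE server_id = ?');
$stmt->execute([$serverId]);
$lastId = (int) $stmt->fetchColumn();   // no row yet -> 0, start from the beginning

// Pull only the rows this server hasn't pushed before.
$new = $local->prepare('SELECT id, vehicle_id, amount, created_at
                        FROM transactions WHERE id > ? ORDER BY id');
$new->execute([$lastId]);

$insert = $central->prepare(
    'INSERT INTO transactions (source_server, source_id, vehicle_id, amount, created_at)
     VALUES (?, ?, ?, ?, ?)'
);

$central->beginTransaction();
foreach ($new as $row) {
    $insert->execute([$serverId, $row['id'], $row['vehicle_id'],
                      $row['amount'], $row['created_at']]);
    $lastId = (int) $row['id'];
}
$central->prepare('REPLACE INTO sync_state (server_id, last_id) VALUES (?, ?)')
        ->execute([$serverId, $lastId]);
$central->commit();
?>
```

Keeping the bookmark on the central server (rather than on each local box) means a local machine can be rebuilt or restored from backup without accidentally re-sending rows it already pushed.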