If I build a client's site on my own server, will search engines flag it as duplicate content once I upload it to their host while the copy still exists on my server?
The best solution is to block robots from indexing your client's site while it lives on your server. You can do that with a robots.txt file, and you can test the rules in Google Webmaster Tools as well. Otherwise, if the development copy gets indexed by search engines, the question of duplicate content certainly creeps in.
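A minimal robots.txt that asks all crawlers to stay out of the entire dev site looks like this (it must be placed at the root of the dev domain, e.g. `/robots.txt`):

```
User-agent: *
Disallow: /
```

Note that robots.txt is advisory: well-behaved crawlers like Googlebot honor it, but it is not an access control. If you need a guarantee, password-protect the dev site as well.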
As Krishna says, add a robots.txt file that disallows all spiders on your dev server. Just make sure you exclude that file when you deploy, or you'll ruin your client's rankings by accidentally copying the blocking robots.txt to their live server.
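One way to avoid copying the dev-only robots.txt is to exclude it in the deploy step. A sketch using `rsync` (the directory names here are hypothetical; a real deploy would target the client's host):

```shell
# Simulated dev docroot with a crawler-blocking robots.txt
mkdir -p dev-site live-site
printf 'User-agent: *\nDisallow: /\n' > dev-site/robots.txt
echo '<html></html>' > dev-site/index.html

# Deploy everything EXCEPT robots.txt to the live docroot
rsync -a --exclude='robots.txt' dev-site/ live-site/
```

After this runs, `live-site/` contains `index.html` but not the blocking `robots.txt`, so the live site stays crawlable.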