So I just popped into my website to see if I have any errors, and boy, I found quite a few: about 50 to 70. I tried fixing them, but I couldn't fix all of them, as I'm really not very good with HTML and I'm not the programmer for this website. So does it affect SERPs? My website is Http://www.parkviewlimo and you can validate it here: http://validator.w3.org/ Let me know if it does. Either way, I'm still going to get those errors fixed. Thanks!
It depends; usually the spiders can still crawl the pages effectively. However, your pages might not be formatted the way you intend when the spider views them. Remember, spiders don't see JavaScript, images, etc. Use the link below to view your web pages the way GoogleBot would see them: http://www.smart-it-consulting.com/internet/google/googlebot-spoofer/
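If you'd rather not rely on a third-party spoofer, you can do roughly the same thing yourself with curl: fetch the raw HTML while sending a Googlebot-style User-Agent header. This is only a sketch; the User-Agent string below is the commonly published Googlebot one, but check Google's documentation for the current value.

```shell
# fetch_as_googlebot URL -- prints the raw HTML a text-only crawler would receive
# (no JavaScript execution, no images, just the markup).
fetch_as_googlebot() {
  # Googlebot-style User-Agent string; verify against Google's docs before relying on it.
  UA="Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
  # -s silences the progress meter; -A sets the User-Agent header
  curl -s -A "$UA" "$1"
}

# Example: fetch_as_googlebot http://www.example.com/
```

Whatever comes back from that command is all the spider has to work with, which is why badly broken markup can hurt even when the page looks fine in a browser.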
Unless your page's code is so messed up that it makes it difficult or near-impossible for the spiders to crawl properly, not really. However, a valid page is easier to maintain when the four mainstream browsers (Internet Explorer, Firefox, Opera and Safari) release new versions (as Firefox and Opera are about to do), along with the "niche" browsers built on their rendering engines (or, in Internet Explorer's case, plugged in to it). There are of course many other reasons to validate your pages, but those fall under web development, not search engine optimization.
HTML does make sense, but if you check your code with a tool, you will still get a lot of errors whether you rectify them or not. Nobody is perfect in this field.
Eh, I have no problem getting my code to validate. Then again, five years' experience hand-coding HTML and CSS will do that to a person.
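For anyone wondering what "validates" actually means here: a minimal page like the one below passes validator.w3.org against the HTML 4.01 Strict DOCTYPE. Every element is closed, every attribute is quoted, and the DOCTYPE tells the validator which rules to check against.

```html
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01//EN"
  "http://www.w3.org/TR/html4/strict.dtd">
<html>
<head>
  <meta http-equiv="Content-Type" content="text/html; charset=utf-8">
  <title>Minimal valid page</title>
</head>
<body>
  <p>Every element closed, every attribute quoted.</p>
</body>
</html>
```

Most real-world validation errors come down to violating one of those three things: missing DOCTYPE, unclosed elements, or unquoted attributes.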
True, but how many of those invalid sites (and by invalid I mean they do not validate against a DOCTYPE) are actually costing their owners money?