I need some SEO help. I have set up my website (if I am allowed to, I will post the URL or the name) and I put some PDF files there. So I went to Google Search Console and did everything I could think of: I remember adding the custom verification file from the listed setup steps, and I remember configuring some sort of DNS setting. Here is the deal, though. It is a Next.js project and the hosting provider is Vercel. Maybe that has something to do with its poor performance. Is there some way I can check on this? Anyway, I went through the process of adding links, but the graphs in Google Search Console show almost no progress. Please advise. I can post more details, like the emails I have been getting, but I guess this is a good start.
That all looks good. There are always going to be odd links that Google crawls which you later delete, but Google keeps trying to recheck them; even with a sitemap it takes a while for them to disappear. Initially I'd be more concerned with what Lighthouse says (there will be a link within the console), but using Next.js and Vercel should be a good start. Do look up SEO guides for Next.js/React, because some things work differently than on PHP pages (see the sketch after this post).
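For example, one Next.js-specific difference is that page titles and descriptions come from the Metadata API rather than hand-written `<head>` tags. A minimal sketch, assuming the App Router (Next.js 13+); the site name and description here are placeholders:

```tsx
// app/layout.tsx -- titles/descriptions via the Next.js Metadata API
import type { Metadata } from 'next'
import type { ReactNode } from 'react'

export const metadata: Metadata = {
  title: {
    default: 'My Site',        // placeholder site name
    template: '%s | My Site',  // per-page titles slot into this template
  },
  description: 'A short, unique description that appears in search snippets.',
}

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body>{children}</body>
    </html>
  )
}
```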
Thanks for the advice. I’ve also noticed that in Search Console, links or pages often appear and then disappear. I keep thinking that I must be doing something wrong, which is why they aren’t being indexed properly.
Thank you. Yours is the very first reply to my request for assistance. Since you have said that all looks good, I have to conclude that getting "bumped up" in the Google search queue is simply something that takes a lot of time and patience. I remember reading somewhere that it is a good idea to create "backlinks", but I hesitate to post a link to my website for fear of being accused of advertising or self-promotion. The little stick figure here has made it clear that this is not allowed. Again, thank you. I have a direction now that you have given me. First I need to check out this "Lighthouse" thing, and then find out what specifically applies to a Next.js site hosted on Vercel.
We do have a section specifically for site reviews: overall, design, or SEO. You are welcome to post there. One of the first questions you need to ask yourself is "do I deserve to rank higher?". Be brutally honest. If your site is a copycat, the answer is probably no, and you need to work on making it better and more relevant. On the other hand, if your site promotes a generic service (accounting, plumbing, etc.), it's really hard to make it shine because there's only so much to say. That's where having a technically sound site matters; then work on social media and local listings to give yourself a local edge.
This happens quite often, especially with newer stacks like Next.js on Vercel. From your description, it doesn't look like you did anything wrong; more likely, Google just hasn't picked things up yet. Here is what I would check first (see the sketch after this list):

- Ensure your pages are actually indexable, meaning no accidental noindex tags or blocked routes.
- If you haven't already, submit a sitemap.
- Use the URL Inspection tool in Google Search Console and request indexing manually.

Also, PDFs don't always behave the same as regular pages when it comes to indexing, which can slow things down a bit. Something that took me a while to understand is that even when the technical setup is correct, Google still needs signals to discover and revisit your pages. If everything is set up properly, it may simply come down to time, along with some initial traffic or mentions of those pages.
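In case it helps, here is a minimal sketch of how robots.txt and a sitemap can be generated in a Next.js App Router project (assumes Next.js 13.3+; example.com is a placeholder for your domain):

```ts
// app/robots.ts -- Next.js serves this as /robots.txt
import type { MetadataRoute } from 'next'

export default function robots(): MetadataRoute.Robots {
  return {
    rules: [{ userAgent: '*', allow: '/' }], // nothing blocked by accident
    sitemap: 'https://example.com/sitemap.xml', // placeholder domain
  }
}
```

```ts
// app/sitemap.ts -- Next.js serves this as /sitemap.xml
import type { MetadataRoute } from 'next'

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    { url: 'https://example.com/', lastModified: new Date(), priority: 1 },
    // PDFs can be listed too, so crawlers discover them directly:
    { url: 'https://example.com/docs/example.pdf', lastModified: new Date() },
  ]
}
```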
If Search Console is set up but you're seeing no movement yet, that's pretty normal early on; SEO can take a few weeks to kick in. Since you're using Next.js on Vercel, double-check things like indexing (robots.txt, sitemap) and make sure your pages are actually crawlable.
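A quick way to self-check crawlability is to fetch a page the way a bot would and look for accidental noindex signals. This is a hypothetical one-off helper script, not an official tool (needs Node 18+ for the global fetch):

```ts
// check-indexable.ts -- sanity-check that a URL isn't blocking indexing
// Run with: npx tsx check-indexable.ts https://your-site.com/
const url = process.argv[2]
if (!url) {
  console.error('usage: npx tsx check-indexable.ts <url>')
  process.exit(1)
}

const res = await fetch(url, { headers: { 'User-Agent': 'Googlebot' } })
const html = await res.text()

// Two common ways a page gets accidentally de-indexed:
const robotsHeader = res.headers.get('x-robots-tag')                        // HTTP header
const noindexMeta = /<meta[^>]+name=["']robots["'][^>]*noindex/i.test(html) // meta tag

console.log('HTTP status: ', res.status)
console.log('X-Robots-Tag:', robotsHeader ?? '(none)')
console.log('noindex meta:', noindexMeta ? 'FOUND (page will not be indexed)' : 'not found')
```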
Yeah, that's pretty normal; Google can take a while to stop crawling old or removed links. I'd focus on the core issues flagged in Search Console first. Also agreed that Next.js SEO has its quirks, so it's worth digging into that specifically.
Next.js on Vercel is a fine stack for SEO; the issue is almost never the framework. What's tripping up most new sites in 2026 is that Google's indexing budget for unproven domains has tightened a lot since the last few HCU waves. New domains are taking 6 to 10 weeks to get past the initial soft-indexing phase even when the sitemap, robots.txt, and inspection requests are all clean.

The thing I'd actually check before chasing technical fixes: pull the Page Indexing report in Search Console and look at the "Discovered, currently not indexed" bucket. If pages are sitting there for weeks, the issue is crawl budget allocation, not your config. App Router with ISR can also serve cached HTML to bots that doesn't match what users see, so it's worth running URL Inspection's "Test live URL" on a couple of pages to compare the rendered output.

The angle nobody has mentioned: your PDFs need their own pass. Google indexes PDFs separately from HTML, and they're notoriously slow to surface. If you actually want them ranking, link to them from indexed HTML pages with descriptive anchor text and add structured data on the page that hosts them; otherwise they'll sit in "discovered but not indexed" for months while the rest of the site catches up.
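To make that last point concrete, here is a rough sketch of a page that links to PDFs with descriptive anchor text plus JSON-LD structured data. The route, file names, and document titles are all hypothetical, and DigitalDocument is just one plausible schema.org type for this:

```tsx
// app/whitepapers/page.tsx -- hypothetical App Router page hosting PDF links
export default function WhitepapersPage() {
  // JSON-LD so the page that links the PDFs carries machine-readable context
  const jsonLd = {
    '@context': 'https://schema.org',
    '@type': 'CollectionPage',
    name: 'Whitepapers',
    hasPart: [
      {
        '@type': 'DigitalDocument',
        name: '2026 Pricing Guide', // hypothetical document title
        url: 'https://example.com/docs/pricing-guide.pdf', // placeholder URL
      },
    ],
  }

  return (
    <main>
      <script
        type="application/ld+json"
        dangerouslySetInnerHTML={{ __html: JSON.stringify(jsonLd) }}
      />
      <h1>Whitepapers</h1>
      {/* descriptive anchor text, not "click here" or a bare URL */}
      <a href="/docs/pricing-guide.pdf">Download the 2026 Pricing Guide (PDF)</a>
    </main>
  )
}
```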