In recent months, I have experimented with Blazor WebAssembly for the presentation part of my website, LudoProgramming.com. The idea was attractive: a front-end written in C#, modern componentization, and an experience similar to JavaScript frameworks. However, after a testing period, I decided to return to classic ASP.NET MVC. The main reason? SEO.
Blazor and SEO challenges
Blazor WebAssembly brings many technical advantages, but for SEO there are some significant challenges:
- Content is generated client-side – search engines do not always index dynamically rendered content correctly.
- Complicated prerendering – delivering meta tags and crawler-friendly content requires integration with ASP.NET Core, server-side prerendering, and additional configuration.
- Perceived performance – the first load of a Blazor WebAssembly app involves downloading the .NET runtime and the application DLLs, which hurts load times, an important ranking factor for SEO.
Thus, to achieve quality SEO in Blazor, many additional elements must be added: prerendering, dynamic rendering for crawlers, complex caching configurations, and optimization for loading speed.
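To give an idea of what the prerendering step looks like, here is a minimal sketch of the host page for a hosted Blazor WebAssembly app. The file name and structure follow the defaults of the hosted template, not my actual site:

```razor
@* Pages/_Host.cshtml in the ASP.NET Core host project (hosted-template defaults) *@
@page "/"
<!DOCTYPE html>
<html>
<head>
    <title>LudoProgramming.com</title>
</head>
<body>
    @* WebAssemblyPrerendered renders the component to static HTML on the server,
       so crawlers receive content immediately; the WASM app then takes over client-side. *@
    <component type="typeof(App)" render-mode="WebAssemblyPrerendered" />
    <script src="_framework/blazor.webassembly.js"></script>
</body>
</html>
```

Even this minimal setup also requires the server to serve Razor Pages and fall back to `_Host` for unknown routes, which is exactly the kind of extra wiring the list above refers to.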
Why classic MVC is more suitable for SEO
With classic ASP.NET MVC, things become much simpler:
- Pages are generated directly on the server, with complete HTML content at delivery time.
- Meta tags can be set directly in views, without additional tricks.
- Search engines immediately receive complete pages and can correctly index each article.
- Loading speed is better, without the overhead brought by Blazor WebAssembly.
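As an illustration of how little is needed, per-page meta tags in MVC can be driven from `ViewData` in the shared layout. This is a minimal sketch; the model property names (`Title`, `Summary`) are hypothetical:

```razor
@* Views/Shared/_Layout.cshtml – emits whatever meta tags each view sets *@
<head>
    <title>@ViewData["Title"] - LudoProgramming.com</title>
    <meta name="description" content="@ViewData["Description"]" />
</head>

@* In an individual article view (property names are illustrative): *@
@{
    ViewData["Title"] = Model.Title;
    ViewData["Description"] = Model.Summary;
}
```

Because the server renders the full HTML before sending it, crawlers see these tags on the very first request, with no prerendering pipeline involved.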
Conclusion
Blazor remains a very interesting technology with potential for complex applications, especially in the area of internal applications or SPAs where SEO is not a priority. However, for a blog or a content site where visibility in search engines is critical, classic MVC remains the simpler and more efficient solution.
In the end, I decided to return to MVC for the public-facing part of the site, so I can focus on content and SEO optimization without stacking extra technical layers on top of Blazor.