AI-Assisted Redevelopment of a Personal Portfolio Website

Personal Project (ryanamoore.com) • 2025

Problem

The existing personal portfolio, built on a Canva template, failed to showcase advanced technical and data skills. The challenge was to redevelop the site into a modern, performant “meta” portfolio that would demonstrate proficiency in both international development and contemporary web and AI technologies.

Methodology

The entire project was executed as a collaboration between me (as lead architect and developer) and a large language model (as pair-programmer and research assistant).

  • Data Collection & Analysis: Developed a Python-based web-scraping pipeline using Selenium to programmatically collect data on professional engagement from LinkedIn. This involved automating browser interactions, handling dynamic web content, and implementing robust error handling and resumability to navigate anti-scraping mechanisms and preserve data integrity across a multi-hour scraping run.
  • Modeling & Technique: Architected and built a static-first website using the Astro framework and Tailwind CSS. The process involved extensive pair-programming with an AI agent to generate component code, troubleshoot complex build-chain configurations (e.g., Tailwind v4, PostCSS, Vite), and design a component-based architecture for reusability and maintainability.
  • Communication & Strategy: Led the strategic planning for the project, from initial technology stack evaluation (Astro vs. Next.js, Vercel vs. Heroku) to defining the project architecture. This involved making key decisions to prioritize a high-performance JAMstack approach and designing a data workflow to transform raw scraped data into curated, front-end-ready content.
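The error handling and resumability described above can be sketched as a checkpoint-and-retry loop. This is a minimal illustration under stated assumptions, not the project's actual code: the `fetch` callable stands in for the Selenium browser automation, and the checkpoint filename is hypothetical.

```python
import json
import time
from pathlib import Path

CHECKPOINT = Path("scrape_checkpoint.json")  # hypothetical checkpoint file

def load_checkpoint():
    """Return the set of URLs already scraped, so a restart skips them."""
    if CHECKPOINT.exists():
        return set(json.loads(CHECKPOINT.read_text()))
    return set()

def save_checkpoint(done):
    """Persist progress so an interrupted run can resume where it stopped."""
    CHECKPOINT.write_text(json.dumps(sorted(done)))

def scrape_all(urls, fetch, max_retries=3, delay=1.0):
    """Scrape each URL with retries; checkpoint after every success.

    `fetch` is a placeholder: in the real pipeline it would wrap
    Selenium's driver.get() plus explicit waits for dynamic content.
    """
    done = load_checkpoint()
    results = {}
    for url in urls:
        if url in done:
            continue  # resume: skip work finished in a previous run
        for attempt in range(1, max_retries + 1):
            try:
                results[url] = fetch(url)
                done.add(url)
                save_checkpoint(done)
                break
            except Exception:
                if attempt == max_retries:
                    raise
                time.sleep(delay * attempt)  # simple linear backoff
    return results
```

Adding a polite, slightly randomized delay between requests would further help with the anti-scraping mechanisms mentioned above.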

Outcome

The project successfully transformed my personal brand from a static, non-technical presence into a dynamic, skills-forward “meta” portfolio.

  • Strategic Impact: The site itself now serves as a primary case study, tangibly demonstrating cutting-edge skills in full-stack development, automation, and applied AI, rather than just listing them on a resume.
  • Operational Impact: Created a semi-automated content pipeline: a Python script can be run periodically to gather fresh engagement data, giving the website a data-driven, easily refreshed feature and reducing manual content management.
  • Knowledge Impact: Delivered a production-grade web application on a current, in-demand tech stack. The process required deep, hands-on debugging of the modern front-end toolchain and practical application of web-scraping best practices, including session management and anti-bot countermeasures.
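The curation step in that pipeline can be sketched as a small transform from raw scraped records to a front-end-ready JSON file that the static build consumes. This is an illustrative sketch, not the project's actual script; the field names and the `content/engagement.json` output path are assumptions.

```python
import json
from pathlib import Path

def curate(raw_posts, top_n=5):
    """Reduce raw scraped engagement records to a curated, display-ready list.

    Keeps only the fields the front end needs, drops entries missing a
    title, and surfaces the most-engaged posts first.
    """
    cleaned = [
        {
            "title": p["title"].strip(),
            "url": p.get("url", ""),
            "reactions": int(p.get("reactions", 0)),
        }
        for p in raw_posts
        if p.get("title", "").strip()
    ]
    cleaned.sort(key=lambda p: p["reactions"], reverse=True)
    return cleaned[:top_n]

def write_content(records, out_path="content/engagement.json"):
    """Write curated records where the static-site build can pick them up."""
    path = Path(out_path)  # assumed location read at Astro build time
    path.parent.mkdir(parents=True, exist_ok=True)
    path.write_text(json.dumps(records, indent=2))
```

Keeping the transform as pure functions makes the "run periodically" workflow easy to test and to schedule, independent of the scraping step that produces the raw records.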