The Prompt That Started It All
"I need you to scrape and extract all game download URLs from the website https://dlpsgame.com/list-all-game-ps4/
Please perform the following tasks:
1. Fetch the webpage content from https://dlpsgame.com/list-all-game-ps4/
2. Extract all URLs from the page, specifically game page URLs and direct download URLs
3. Organize the extracted data into a structured format (JSON, HTML, or Markdown)
4. Save the output to the workspace directory
Please analyze the page structure first, then extract and organize all the game URLs in the most user-friendly format for browsing and accessing the download links."
That's it. One prompt. Everything else was automated by Galaxy.ai.
Technology Stack
Python
BeautifulSoup
Requests
Playwright
JavaScript
HTML5/CSS3
JSON
GitHub Pages
Galaxy.ai
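
Playwright's presence in the stack suggests a fallback for pages whose links only appear after JavaScript runs, where plain requests sees nothing. A minimal sketch of what that fallback might look like; the function name is illustrative, not taken from the generated scripts:

# playwright_fallback.py (hypothetical) - harvest hrefs after JS rendering
from playwright.sync_api import sync_playwright

def fetch_rendered_links(url):
    """Load the page in headless Chromium and return every anchor's href."""
    with sync_playwright() as p:
        browser = p.chromium.launch(headless=True)
        page = browser.new_page()
        page.goto(url, timeout=30000)  # Playwright timeouts are in milliseconds
        hrefs = page.eval_on_selector_all("a[href]", "els => els.map(e => e.href)")
        browser.close()
    return hrefs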
Key Scripts Generated
# fetch_and_generate.py - Main scraper
import requests
from bs4 import BeautifulSoup

def fetch_games_from_web():
    """Fetch the PS4 list page and collect links to individual game pages."""
    url = "https://dlpsgame.com/list-all-game-ps4/"
    headers = {"User-Agent": "Mozilla/5.0"}  # `headers` was undefined in the excerpt; a browser-style UA is assumed
    response = requests.get(url, headers=headers, timeout=30)
    soup = BeautifulSoup(response.content, 'html.parser')
    games = []
    for link in soup.find_all('a', href=True):
        href = link['href']
        text = link.get_text(strip=True)
        # Game pages on the site follow one of two URL slug patterns
        if '-ps4-pkg' in href or '-ps4-download-free' in href:
            games.append({"name": text, "url": href})
    return games
# extract_download_links.py - Download link extractor
from bs4 import BeautifulSoup

def extract_download_links_from_page(url, session):
    """Fetch one game page and bucket its download links by file host."""
    response = session.get(url, timeout=15)
    soup = BeautifulSoup(response.content, 'html.parser')
    download_links = {
        'mediafire': [],
        '1file': [],
        'other': []  # reserved for additional hosts; left unfilled in this excerpt
    }
    for link in soup.find_all('a', href=True):
        href = link['href']
        if 'mediafire.com' in href:
            download_links['mediafire'].append(href)
        elif '1file' in href:
            download_links['1file'].append(href)
    return download_links
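
Steps 3 and 4 of the original prompt, organizing the results into a structured format and saving them to the workspace directory, would tie the two functions above together roughly as follows. This is a sketch, not the generated code: the driver filename, output path, and requests.Session setup are assumptions.

# build_catalog.py (hypothetical driver)
import json
import requests
from fetch_and_generate import fetch_games_from_web
from extract_download_links import extract_download_links_from_page

games = fetch_games_from_web()  # steps 1-2: collect game page URLs
session = requests.Session()    # reuse one connection across many page fetches
for game in games:
    game["downloads"] = extract_download_links_from_page(game["url"], session)

# steps 3-4: structured JSON written to the workspace directory
with open("workspace/ps4_games.json", "w", encoding="utf-8") as f:
    json.dump(games, f, indent=2, ensure_ascii=False)

From that JSON, the HTML and Markdown views the prompt asks for (and the GitHub Pages site listed in the stack) can then be rendered.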