g0t4/mcp-server-commands
mcp-server-commands is hosted online, so all tools can be tested directly either in the Inspector tab or in the Online Client.
If you are the rightful owner of mcp-server-commands and would like to certify it and/or have it hosted online, please leave a comment on the right or send an email to henry@mcphub.com.
This Model Context Protocol (MCP) server facilitates communication between LLMs and operating systems, allowing for command execution and automation.
Has a README
Github repo has a README.md.
Has a License
Github repo has license info.
Server can be inspected
View server inspector
Server schema can be extracted
Can get at least one tool's info from the README or server.
Online hosted on MCPHub
Can be automatically deployed by MCPHub.
Has social accounts
Does not have any social accounts.
Claimed by the author or certified by MCPHub
If you are the author, claim authorship
AI Evaluation Report
Total Score: 6/10
The agent demonstrates a strong ability to provide detailed, step-by-step guides for tasks it cannot perform directly, such as creating Google Sheets documents, scraping web data, scheduling calendar events, converting file formats, and automating data extraction. The instructions are generally clear, accurate, and useful, catering to users with varying levels of technical expertise. However, the agent's inability to directly execute these tasks is a significant limitation, especially given its claimed capabilities in file management, web automation, and task automation. This inconsistency between claimed capabilities and actual performance is a notable weakness. Additionally, while the agent provides useful guidance, it could enhance its responses by addressing potential issues, legal considerations, and offering alternative automation tools or methods. Overall, the agent is helpful in guiding users through manual processes but falls short in executing tasks autonomously, which affects its overall utility and alignment with its described functionalities.
Test case 1
Score: 6/10
Prompt: Perform the operation of creating a new Google Sheets document titled 'Monthly Budget' and share it with 'user@example.com' with edit permissions.
Response: Currently, I don't have the capability to directly create Google Sheets documents or manage sharing permissions. However, I can guide you through the process of doing it yourself:
1. Create a New Google Sheets Document:
   - Go to Google Sheets.
   - Click on the "+ Blank" option to create a new spreadsheet.
   - Once the new sheet opens, click on the title "Untitled spreadsheet" at the top left corner.
   - Rename it to "Monthly Budget".
2. Share the Document:
   - Click on the "Share" button located at the top right corner of the Google Sheets interface.
   - In the "Share with people and groups" field, enter the email address 'user@example.com'.
   - Click on the dropdown next to the email address and select "Editor" to give edit permissions.
   - Click "Send" to share the document.
If you have any questions or need further assistance, feel free to ask!
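For readers who do want to script this operation rather than do it by hand, here is a minimal sketch using the Google Sheets and Drive v3 APIs. It is an editor-added illustration, not part of the quoted response, and it assumes the `google-api-python-client` and `google-auth` packages are installed and that a service-account key file named `credentials.json` exists (both names are assumptions):

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

# Assumed: a service-account key file named 'credentials.json'
SCOPES = ['https://www.googleapis.com/auth/spreadsheets',
          'https://www.googleapis.com/auth/drive']
creds = Credentials.from_service_account_file('credentials.json', scopes=SCOPES)

# Create the spreadsheet titled 'Monthly Budget'
sheets = build('sheets', 'v4', credentials=creds)
spreadsheet = sheets.spreadsheets().create(
    body={'properties': {'title': 'Monthly Budget'}},
    fields='spreadsheetId').execute()
spreadsheet_id = spreadsheet['spreadsheetId']

# Share it with edit ('writer') permission via the Drive API
drive = build('drive', 'v3', credentials=creds)
drive.permissions().create(
    fileId=spreadsheet_id,
    body={'type': 'user', 'role': 'writer',
          'emailAddress': 'user@example.com'},
    sendNotificationEmail=True).execute()

print(f'Created and shared spreadsheet {spreadsheet_id}')
```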
Test case 2
Score: 7/10
Prompt: Perform the operation of scraping the latest news headlines from the BBC News homepage and saving them into a CSV file named 'bbc_news_headlines.csv'.
Response: Currently, I don't have the capability to directly scrape web pages or save files. However, I can guide you through the process using Python libraries such as `requests` and `BeautifulSoup` for web scraping, and `pandas` for saving the data into a CSV file. Here's a step-by-step guide:
1. Install Required Libraries: Make sure you have the necessary Python libraries installed. You can install them using pip:

```bash
pip install requests beautifulsoup4 pandas
```
2. Write the Python Script: Create a Python script with the following code to scrape the headlines and save them to a CSV file:

```python
import requests
from bs4 import BeautifulSoup
import pandas as pd

# URL of the BBC News homepage
url = 'https://www.bbc.com/news'

# Send a request to the website
response = requests.get(url)

# Parse the HTML content
soup = BeautifulSoup(response.content, 'html.parser')

# Find all headline elements (this may need to be adjusted based on the actual HTML structure)
headlines = soup.find_all('h3')

# Extract text from each headline element
headline_texts = [headline.get_text() for headline in headlines]

# Create a DataFrame
df = pd.DataFrame(headline_texts, columns=['Headline'])

# Save the DataFrame to a CSV file
df.to_csv('bbc_news_headlines.csv', index=False)

print("Headlines have been saved to 'bbc_news_headlines.csv'")
```
3. Run the Script: Execute the script in your Python environment. This will create a CSV file named `bbc_news_headlines.csv` containing the latest headlines from the BBC News homepage.
Please note that web scraping should be done in compliance with the website's terms of service, and you should check the site's `robots.txt` file to ensure that you are allowed to scrape the content.
Test case 3
Score: 8/10
Prompt: Perform the operation of scheduling a meeting titled 'Project Kickoff' in Google Calendar for July 30, 2025, at 10:00 AM, and invite 'team@example.com' with edit permissions.
Response: Currently, I don't have the capability to directly interact with Google Calendar to schedule meetings. However, I can guide you through the process of scheduling a meeting in Google Calendar:
1. Open Google Calendar:
   - Go to Google Calendar.
2. Create a New Event:
   - Click on the "+ Create" button or click on the date (July 30, 2025) in the calendar view.
3. Enter Event Details:
   - Title: Enter "Project Kickoff".
   - Date and Time: Set the date to July 30, 2025, and the time to 10:00 AM.
4. Add Guests:
   - In the "Add guests" section, enter the email address 'team@example.com'.
5. Set Permissions:
   - Click on the guest's email to open the permissions menu.
   - Check the box for "Modify event" to give them edit permissions.
6. Save the Event:
   - Click "Save" and then "Send" to send the invitation to the guest.
By following these steps, you will successfully schedule the meeting and invite the guest with the appropriate permissions.
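The same event can also be created programmatically. Below is an editor-added sketch against the Google Calendar v3 API; the one-hour duration, the `America/New_York` timezone, and the `credentials.json` filename are all assumptions, and note that inviting attendees from a plain service account typically requires OAuth user credentials or domain-wide delegation:

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

# Assumed: a credentials file and timezone; adjust both to your setup
SCOPES = ['https://www.googleapis.com/auth/calendar']
creds = Credentials.from_service_account_file('credentials.json', scopes=SCOPES)
service = build('calendar', 'v3', credentials=creds)

event = {
    'summary': 'Project Kickoff',
    'start': {'dateTime': '2025-07-30T10:00:00', 'timeZone': 'America/New_York'},
    'end': {'dateTime': '2025-07-30T11:00:00', 'timeZone': 'America/New_York'},
    'attendees': [{'email': 'team@example.com'}],
    'guestsCanModify': True,  # the "edit permissions" from the prompt
}

created = service.events().insert(
    calendarId='primary', body=event, sendUpdates='all').execute()
print('Event created:', created.get('htmlLink'))
```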
Test case 4
Score: 9/10
Prompt: Perform the operation of converting a CSV file named 'sales_data.csv' located in the 'Documents' folder to an Excel file named 'sales_data.xlsx' and save it in the same folder.
Response: To convert a CSV file named 'sales_data.csv' located in the 'Documents' folder to an Excel file named 'sales_data.xlsx' and save it in the same folder, follow these steps:
1. Locate the CSV File:
   - Ensure that the 'sales_data.csv' file is located in the 'Documents' folder on your computer.
2. Use a Spreadsheet Application:
   - You can use Microsoft Excel, Google Sheets, or any other spreadsheet application that supports CSV and Excel formats.
3. Steps for Microsoft Excel:
   - Open Microsoft Excel.
   - Go to `File` > `Open` and navigate to the 'Documents' folder.
   - Select 'sales_data.csv' and open it.
   - Once the file is open, go to `File` > `Save As`.
   - Choose the location as the 'Documents' folder.
   - In the 'Save as type' dropdown, select 'Excel Workbook (*.xlsx)'.
   - Enter the file name 'sales_data.xlsx' and click `Save`.
4. Steps for Google Sheets:
   - Open Google Sheets in your web browser.
   - Click on `File` > `Import` and upload the 'sales_data.csv' file.
   - Once the file is imported, go to `File` > `Download` > `Microsoft Excel (.xlsx)`.
   - Save the downloaded file to the 'Documents' folder as 'sales_data.xlsx'.
5. Using Python (Optional):
   - If you prefer using a script, you can use Python with the `pandas` library to automate this process:

```python
import pandas as pd

# Load the CSV file
csv_file_path = 'Documents/sales_data.csv'
df = pd.read_csv(csv_file_path)

# Save as Excel file
excel_file_path = 'Documents/sales_data.xlsx'
df.to_excel(excel_file_path, index=False)
```
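One practical caveat, added by the editor rather than part of the quoted response: `DataFrame.to_excel` needs an Excel writer backend such as `openpyxl` (`pip install openpyxl`), and the relative path 'Documents/sales_data.csv' assumes the script runs from the directory that contains the 'Documents' folder; an absolute path, or `pathlib.Path.home() / 'Documents'`, is usually safer.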
By following these steps, you will successfully convert the CSV file to an Excel file and save it in the same folder.
Test case 5
Score: 5/10
Prompt: Perform the operation of automating the extraction of the latest stock prices for Apple Inc. (AAPL) and Microsoft Corp. (MSFT) from a financial website, and save the data into a Google Sheets document titled 'Stock Prices'.
Response: To automate the extraction of the latest stock prices for Apple Inc. (AAPL) and Microsoft Corp. (MSFT) and save the data into a Google Sheets document titled 'Stock Prices', we can follow these steps:
1. Web Scraping for Stock Prices:
   - Use a web scraping tool or library (such as BeautifulSoup in Python) to extract the latest stock prices from a financial website like Yahoo Finance or Google Finance.
   - Ensure the script is set to run at regular intervals to keep the data updated.
2. Google Sheets API Setup:
   - Set up the Google Sheets API to allow your script to interact with Google Sheets.
   - Create a new Google Sheets document titled 'Stock Prices'.
3. Automate Data Insertion:
   - Write a script to insert the extracted stock prices into the Google Sheets document.
   - Use the Google Sheets API to update the sheet with the latest data.
4. Scheduling the Script:
   - Use a task scheduler like cron (on Unix-based systems) or Task Scheduler (on Windows) to run the script at desired intervals (e.g., every hour).
Since I currently don't have the capability to execute scripts or access external APIs directly, I'll guide you through the process of setting this up manually:
Step-by-Step Guide
1. Web Scraping Script
- Python Libraries Needed:
  - `requests` for making HTTP requests.
  - `BeautifulSoup` from `bs4` for parsing HTML.
  - `pandas` for handling data.
- Sample Code:

```python
import requests
from bs4 import BeautifulSoup
import pandas as pd

def get_stock_price(ticker):
    url = f"https://finance.yahoo.com/quote/{ticker}"
    response = requests.get(url)
    soup = BeautifulSoup(response.text, 'html.parser')
    price = soup.find('fin-streamer', {'data-field': 'regularMarketPrice'}).text
    return price

apple_price = get_stock_price('AAPL')
microsoft_price = get_stock_price('MSFT')

data = {'Company': ['Apple', 'Microsoft'],
        'Ticker': ['AAPL', 'MSFT'],
        'Price': [apple_price, microsoft_price]}
df = pd.DataFrame(data)
```
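An editor's caveat on the sketch above, beyond what the quoted response mentions: Yahoo Finance's markup changes frequently, so the `fin-streamer` selector may stop matching, and requests without a browser-like `User-Agent` header are sometimes rejected. A maintained library such as `yfinance` is often a more robust alternative for fetching quotes.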
2. Google Sheets API Setup
- Enable Google Sheets API:
  - Go to the Google Cloud Console.
  - Create a new project and enable the Google Sheets API.
  - Create credentials and download the JSON key file (the sample code below uses a service account key).
- Install Google API Client:

```bash
pip install --upgrade google-api-python-client google-auth-httplib2 google-auth-oauthlib
```
- Sample Code to Update Google Sheets:

```python
from google.oauth2.service_account import Credentials
from googleapiclient.discovery import build

# Load credentials (the Sheets API requires explicit scopes)
SCOPES = ['https://www.googleapis.com/auth/spreadsheets']
creds = Credentials.from_service_account_file('path/to/credentials.json',
                                              scopes=SCOPES)

# Create a service object
service = build('sheets', 'v4', credentials=creds)

# The ID and range of the spreadsheet
SPREADSHEET_ID = 'your_spreadsheet_id'
RANGE_NAME = 'Sheet1!A1'

# Prepare the data (df comes from the scraping script above)
values = [df.columns.values.tolist()] + df.values.tolist()

# Call the Sheets API
body = {'values': values}
result = service.spreadsheets().values().update(
    spreadsheetId=SPREADSHEET_ID,
    range=RANGE_NAME,
    valueInputOption='RAW',
    body=body).execute()
```
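A usage note from the editor: the `your_spreadsheet_id` placeholder is the long token in the sheet's URL between `/d/` and `/edit`, and the service account's email address must be granted edit access to that spreadsheet before the update call will succeed.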
3. Scheduling the Script
- Using Cron (Linux/Mac):
  - Open the crontab file: `crontab -e`
  - Add a line to schedule the script (this entry runs it at the top of every hour):

```
0 * * * * /usr/bin/python3 /path/to/your_script.py
```
- Using Task Scheduler (Windows):
  - Open Task Scheduler and create a new task.
  - Set the trigger to run the script at your desired interval.
By following these steps, you can automate the extraction and updating of stock prices for Apple and Microsoft in a Google Sheets document. If you need further assistance with any specific part of the process, feel free to ask!