The nice thing about working with a REST API is that you can call it from any language or project. Many researchers use Python because it has powerful tools for data science and is relatively easy to learn. This blog post walks you through setting up a Python project to use the Neural Cloud REST API.
Create an API key
The first step is to go to the admin console and create an API key. You will have to create an account if you have not done so already.
Python code
First we will import the required modules. time is part of the Python standard library, but requests is a third-party package, so install it first with pip install requests if you do not already have it.
import requests
import time
Next you will add your API key and the EDF file you wish to upload.
# Add your API key and path to the file you want to upload
API_KEY = 'ADD_YOUR_API_KEY_HERE'
FILE = 'PATH_TO_YOUR_FILE'
Next we will upload our EDF file to the Neural Cloud. Here we construct our HTTP POST request, which includes our API key and EDF file. We then send the request and parse the response as JSON.
# Upload ECG file
# Construct HTTP POST request
url = 'https://api.theneuralcloud.com/api/v1/ecg_wave_analysis'
headers = {'Authorization': f'Bearer {API_KEY}'}

# Send request to upload file
print(f'Sending file {FILE}')
with open(FILE, 'rb') as f:
    r = requests.post(url, files={'file': f}, headers=headers)

# Get response from server
data = r.json()
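If the upload fails (for example, with a bad API key), the response may not contain a job object at all. A quick `r.raise_for_status()` call will surface HTTP errors; beyond that, here is a minimal sketch of defensive parsing, assuming only the `{'job': {...}}` shape shown below (the server's exact error format is not documented here):

```python
def extract_job(data):
    """Pull the job id and status out of a parsed response.

    Assumes the {'job': {...}} shape returned on success; raises a
    clear error if the server returned something else.
    """
    job = data.get('job')
    if job is None:
        raise ValueError(f'Unexpected response from server: {data}')
    return job['id'], job['status']
```

With this helper, the next step becomes `job_id, job_status = extract_job(r.json())`.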
The response, printed as a Python dictionary, will look something like the following:
{
    'job': {
        'id': 1,
        'type': 'ecg_wave_analysis',
        'status': 'queued',
        'credits': 70,
        'received_at': '2024-06-05T01:04:08.885Z',
        'completed_at': None,
        'options': {},
        'error': None,
        'result': None,
        'output_files': []
    }
}
The important elements here are the job ID and status. The next step will be pulling this information out of the JSON.
# Get job id
job_id = data['job']['id']

# Get job status (queued)
job_status = data['job']['status']
Next we can query the server to check the status of the job. The following code constructs our HTTP GET request, then loops, checking the status every 5 seconds until the job is complete.
# Construct HTTP GET request to check status
url = f'https://api.theneuralcloud.com/api/v1/jobs/{job_id}'
headers = {'Authorization': f'Bearer {API_KEY}'}

# Run loop checking status
while job_status != 'completed':
    print(f'Current status {job_status}')

    # Sleep so we are not constantly calling the server
    time.sleep(5)

    # Get status
    r = requests.get(url, headers=headers)
    data = r.json()
    job_status = data['job']['status']
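Note that the loop above will run forever if the job ends in an error state or the server stalls. As a sturdier alternative, here is a sketch of the same polling logic with a timeout and a terminal-status check. The 'failed' status name is an assumption on my part; adjust it to whatever statuses the API actually reports. The status lookup is passed in as a callable so the helper works with any client code:

```python
import time

def wait_for_job(fetch_status, poll_interval=5, timeout=600):
    """Poll fetch_status() until the job reaches a terminal state.

    fetch_status is any callable returning the current status string,
    e.g. lambda: requests.get(url, headers=headers).json()['job']['status'].
    Raises TimeoutError if no terminal state is seen within `timeout` seconds.
    """
    deadline = time.monotonic() + timeout
    status = fetch_status()
    while status not in ('completed', 'failed'):
        if time.monotonic() > deadline:
            raise TimeoutError(f'Job still {status!r} after {timeout} seconds')
        # Sleep so we are not constantly calling the server
        time.sleep(poll_interval)
        status = fetch_status()
    return status
```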
When the job is done we can print the results.
# Print the results
print(data)
The results will be an object that looks like the following:
{
    'job': {
        'id': 1,
        'type': 'ecg_wave_analysis',
        'status': 'completed',
        'credits': 70,
        'received_at': '2024-06-05T01:04:08.885Z',
        'completed_at': '2024-06-05T01:05:42.113Z',
        'options': {},
        'error': None,
        'result': {},
        'output_files': [
            {
                'filename': 'wave_analysis.csv',
                'url': 'https://....'
            }
        ]
    }
}
Included in the results will be a link to a CSV file.
csv_url = data['job']['output_files'][0]['url']
Using a similar request as before, you can download this file. You can also open this URL in a browser and download it from there.
# Get URL to CSV
csv_url = data['job']['output_files'][0]['url']

# Download the file
r = requests.get(csv_url, allow_redirects=True)

# Write the file to disk
with open('wave_analysis.csv', 'wb') as f:
    f.write(r.content)
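A job may also return more than one output file. Here is a small sketch that saves every entry in output_files under the filename the API reports. The fetch parameter is injected as a callable (pass lambda u: requests.get(u, allow_redirects=True).content) so the helper itself stays network-free and easy to test:

```python
import os

def download_outputs(output_files, fetch, dest='.'):
    """Save each entry of job['output_files'] under its reported filename.

    fetch is a callable mapping a URL to bytes, e.g.
    lambda u: requests.get(u, allow_redirects=True).content.
    Returns the list of paths written.
    """
    paths = []
    for entry in output_files:
        # basename() guards against unexpected path components in the name
        path = os.path.join(dest, os.path.basename(entry['filename']))
        with open(path, 'wb') as f:
            f.write(fetch(entry['url']))
        paths.append(path)
    return paths
```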
You can also load the file directly into something like Pandas.
import pandas as pd
import io

r = requests.get(csv_url, allow_redirects=True)
df = pd.read_csv(io.StringIO(r.content.decode('utf-8')))
Full code
import requests
import time

# Add your API key and path to the file you want to upload
API_KEY = 'ADD_YOUR_API_KEY_HERE'
FILE = 'PATH_TO_YOUR_FILE'

# Upload ECG file
# Construct HTTP POST request
url = 'https://api.theneuralcloud.com/api/v1/ecg_wave_analysis'
headers = {'Authorization': f'Bearer {API_KEY}'}

# Send request to upload file
print(f'Sending file {FILE}')
with open(FILE, 'rb') as f:
    r = requests.post(url, files={'file': f}, headers=headers)

# Get response from server
data = r.json()

# Get job id
job_id = data['job']['id']

# Get job status (queued)
job_status = data['job']['status']

# Construct HTTP GET request to check status
url = f'https://api.theneuralcloud.com/api/v1/jobs/{job_id}'
headers = {'Authorization': f'Bearer {API_KEY}'}

# Run loop checking status
while job_status != 'completed':
    print(f'Current status {job_status}')

    # Sleep so we are not constantly calling the server
    time.sleep(5)

    # Get status
    r = requests.get(url, headers=headers)
    data = r.json()
    job_status = data['job']['status']

# Print the results
print(data)

# Get CSV url
csv_url = data['job']['output_files'][0]['url']
print(csv_url)