PySyft Client#

Estimated reading time: 8’

What you’ll learn#

This guide’s objective is to help you understand how to make the best use of the PySyft client, which is the interface to Syft Datasites.

Overview#

PySyft implements a client-server architecture, where the client is an interface that allows both data owners and data scientists to interact with a Datasite according to their respective roles.

Generally speaking, a client can take any form, such as a website, a command-line tool, or a library. Given the programmatic nature of remote code execution, PySyft’s main client is a Python library, called syft. In the future, we will support more client formats.

How it works#

The Python client exposes a collection of user-friendly APIs available on a Datasite server. Depending on who the user is (admin, data scientist, guest), the available API endpoints can be restricted.

To get a PySyft client, you need to log in using an account. If you don’t already have one, you can self-register, provided this option is enabled on the Datasite server. More details are available in the Users API (link).

Getting a Python client#

import syft as sy

# Launch a local development server
server = sy.orchestra.launch(
    name="my_special_server",
    reset=True,
    dev_mode=False,
    port="auto",
)

# Log in using the default admin credentials
admin_client = sy.login(
    url="localhost",
    port=server.port,
    email="[email protected]",
    password="changethis",
)
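
If self-registration is enabled on the Datasite, a new user can create an account and log in with it; the endpoints available to that client are then restricted according to the data scientist role. The snippet below is a minimal sketch, assuming the guest login and register methods behave as in recent syft releases; the name, email and password are illustrative.

# A sketch of self-registering a new account (assumes guest signup is enabled)
guest_client = sy.login_as_guest(url="localhost", port=server.port)
guest_client.register(
    name="Jane Doe",
    email="[email protected]",
    password="sup3rs3cret",
    password_verify="sup3rs3cret",
)

# Log in with the newly registered account; the available endpoints are now
# restricted to the data scientist role
ds_client = sy.login(
    url="localhost",
    port=server.port,
    email="[email protected]",
    password="sup3rs3cret",
)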

Exploring available APIs#

There is a wide range of APIs exposed via the client. You can take a look at them via the client.api object:

admin_client.api
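
Each entry maps to a service endpoint you can call. Below is a brief sketch of drilling down further, assuming your syft version exposes the services attribute and a user service with a get_all endpoint:

# List the available service endpoints, then call one of them
admin_client.api.services
admin_client.api.services.user.get_all()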

Main APIs#

While this list is extensive, getting started with PySyft only requires familiarity with the following:

  • Orchestra API: ability to launch a local development server

  • Settings API: ability to update the server’s settings and customize it

  • Users API: ability to manage the users of a server

  • Project API: ability to see and inspect projects sent or received by the current user

  • Request API: ability to see and inspect the requests sent or received by the current user

  • Code API: ability to view and execute submitted code

  • Dataset API: ability to view, inspect and upload datasets and various types of assets (illustrated in the sketch after this list)

  • Sync API: ability to manage the synchronisation and information flow for Datasites with both low-side and high-side servers

  • Notifications API: ability to view and manage the notifications received by the current user
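
The sketch below ties three of these APIs together: the data owner uploads a dataset, a data scientist proposes code against one of its assets, and the owner reviews and approves the request before the code runs on the private data. It is a minimal illustration rather than a definitive recipe: the dataset contents and function name are made up, it reuses the admin_client and ds_client from the earlier snippets, and details of the flow may differ slightly between syft versions.

import pandas as pd

# Dataset API: upload a dataset with a private asset and a mock counterpart
data = pd.DataFrame({"age": [31, 45, 27], "income": [55_000, 72_000, 48_000]})
mock = pd.DataFrame({"age": [30, 40, 20], "income": [50_000, 70_000, 45_000]})

dataset = sy.Dataset(name="demo_dataset", description="A tiny illustrative dataset")
dataset.add_asset(sy.Asset(name="demo_asset", data=data, mock=mock))
admin_client.upload_dataset(dataset)

# Code API: the data scientist proposes code to run against the asset
asset = ds_client.datasets[0].assets[0]

@sy.syft_function_single_use(data=asset)
def mean_age(data):
    return data["age"].mean()

ds_client.code.request_code_execution(mean_age)

# Request API: the data owner inspects and approves the incoming request
admin_client.requests[0].approve()

# The approved code can now be executed on the private data
result = ds_client.code.mean_age(data=asset)
result.get()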

More APIs are available for advanced use cases, such as custom workloads, support for large amounts of data, networking capabilities, or enclaves; these are currently in a beta stage.

  • API: ability to inspect the current APIs and extend them with custom-defined ones

  • Worker Image API, Worker API, Worker Pool API: ability to define and launch custom images and spawn multiple worker pools and workers that run those images to scale execution

  • Network API: ability to discover and be discovered through a centralised registry containing a large number of Datasites

  • Attestation API, Enclave API: ability to conduct secure data joins between parties via secure enclaves

  • Blob Storage API: ability to use blob storage as a storage option, to scale up the supported data size or to use data formats not natively supported by the Dataset API

More documentation on these will be released soon.

Reviewing specific APIs#

Some APIs directly expose the endpoints you need to interact with, like so:

admin_client.users
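
The Settings API is exposed directly on the client in the same way; for instance, a data owner can enable self-registration on the server. A one-line sketch, assuming the allow_guest_signup endpoint is available in your syft version:

# Allow new users to self-register on this Datasite
admin_client.settings.allow_guest_signup(enable=True)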

In other cases, the API endpoint is just a submodule, which you can explore further like so:

admin_client.api.sync
admin_client.api.sync.sync_items?