PyTorch
Created at | 22 Feb 2024 |
Editor(s) | Alejandro Carmena Magro, JD-017 Machine Learning Ops |
Supervisor(s) | Alfonso Medela, JD-005 Technical Manager & Person Responsible for Regulatory Compliance |
Approval | Approved |
Description
PyTorch is a deep learning library widely used for applications in computer vision and natural language processing. It provides a flexible and intuitive interface for building and training neural networks, leveraging automatic differentiation and GPU acceleration for efficient computation. PyTorch is favored by researchers and developers for its dynamic computation graph, ease of use, and extensive ecosystem of tools and libraries.
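The dynamic computation graph and automatic differentiation mentioned above can be sketched in a few lines (a minimal illustration, not code from the device):

```python
import torch

# Dynamic computation graph: operations are recorded as they execute,
# and gradients are then computed by automatic differentiation.
x = torch.tensor(2.0, requires_grad=True)
y = x ** 2 + 3 * x   # the graph is built on the fly
y.backward()         # dy/dx = 2x + 3 = 7 at x = 2
```

Because the graph is rebuilt on every forward pass, control flow such as Python `if` statements and loops can depend on the data itself.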
General details
- Developer(s): Originally developed by the Facebook AI Research lab (FAIR, now Meta AI); since 2022 governed by the PyTorch Foundation, along with a community of contributors.
- Open source: Yes
- Language(s): The core of PyTorch is written in C++, with its front end accessible in Python.
- Repository: https://github.com/pytorch/pytorch
- License: BSD-3-Clause (modified BSD)
- Operating system(s): Compatible with Linux, Windows, and macOS.
- Actively maintained: Yes (latest activity within the past week)
Intended use on the device
The SOUP is used in the medical device for the following specific purposes only:
- Use the building blocks provided by the library to create more complex deep learning models for computer vision.
- Apply convenient metrics operating on tensors to the predictions of deep learning models.
- Load serialized deep learning models into memory.
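A minimal sketch of these three intended uses (the model architecture, tensor shapes, and file path below are hypothetical, chosen only for illustration):

```python
import os
import tempfile

import torch
import torch.nn as nn

# 1. Compose library building blocks into a small computer-vision model.
model = nn.Sequential(
    nn.Conv2d(3, 8, kernel_size=3, padding=1),
    nn.ReLU(),
    nn.AdaptiveAvgPool2d(1),
    nn.Flatten(),
    nn.Linear(8, 2),
)

# 2. Apply a tensor-based metric (here, accuracy) to model predictions.
images = torch.randn(4, 3, 32, 32)           # dummy batch of RGB images
labels = torch.tensor([0, 1, 0, 1])
with torch.no_grad():
    preds = model(images).argmax(dim=1)
accuracy = (preds == labels).float().mean()

# 3. Serialize the model's parameters and load them back into memory.
path = os.path.join(tempfile.gettempdir(), "model.pt")  # hypothetical path
torch.save(model.state_dict(), path)
model.load_state_dict(torch.load(path))
```

In the device, only pre-trained serialized weights are loaded; the training utilities of the library are out of scope for the intended use.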
Requirements
For the integration and safe use of this SOUP within a software system, it is important to outline both functional and performance requirements. These requirements help mitigate risks and ensure that compatibility and performance standards are met.
Functional
- GPU support: Offer comprehensive support for CUDA-enabled GPUs, allowing for automatic differentiation and tensor operations to be offloaded to the GPU for accelerated computation. This includes the ability to easily switch between CPU and GPU environments, and manage multiple GPUs for parallel processing.
- Modularity: Allow for the creation, training, and inference of neural networks using modular components that can be easily customized and combined.
- Extensibility: Users should be able to extend it with custom modules and optimizers, allowing for innovation and adaptation to new research or application requirements.
- Data management: Provide efficient data loaders for handling and preprocessing large datasets for training and evaluation.
- Serialization: Support for saving and loading models and their parameters, facilitating model checkpointing, sharing, and deployment.
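The functional requirements above can be illustrated with a short, hedged sketch (the `Scaler` module, dataset sizes, and shapes are hypothetical examples, not part of the device):

```python
import torch
from torch import nn
from torch.utils.data import DataLoader, TensorDataset

# GPU support: switch between CPU and GPU without changing the rest of the code.
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")

# Data management: batched, shuffled loading of a (dummy) dataset.
dataset = TensorDataset(torch.randn(16, 10), torch.randint(0, 2, (16,)))
loader = DataLoader(dataset, batch_size=4, shuffle=True)

# Extensibility: a custom module that plugs into the standard module system.
class Scaler(nn.Module):
    """Hypothetical custom module: multiplies its input by a learnable scalar."""
    def __init__(self):
        super().__init__()
        self.weight = nn.Parameter(torch.ones(1))

    def forward(self, x):
        return x * self.weight

# Modularity: custom and built-in components combine freely.
model = nn.Sequential(Scaler(), nn.Linear(10, 2)).to(device)

for features, targets in loader:
    logits = model(features.to(device))
    loss = nn.functional.cross_entropy(logits, targets.to(device))
    break  # one batch is enough for this sketch
```

The same `device` abstraction covers the requirement to offload computation to CUDA-enabled GPUs when they are available, falling back to the CPU otherwise.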
Performance
- Speed: Provide fast execution times for training and inference processes compared to other available deep learning frameworks.
- Scalability: Efficiently scale its computations across multiple CPUs and GPUs to accommodate the needs of both small-scale experiments and large-scale training tasks.
- Resource utilization: Effective management of memory usage to optimize for both speed and capacity.
- Distributed training: Built-in support for distributed training to enable parallelism across multiple devices and nodes, improving training times for large models and datasets.
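As a minimal illustration of the built-in distributed-training support (a single-process, world-size-1 sketch on the CPU `gloo` backend; a real deployment would launch one process per device and rank):

```python
import os
import tempfile

import torch
import torch.distributed as dist

# Rendezvous via a shared file; here a fresh temporary path on the local disk.
fd, store_path = tempfile.mkstemp()
os.close(fd)
os.remove(store_path)  # the FileStore will recreate it

# Initialize the process group: one rank, CPU-only "gloo" backend.
dist.init_process_group(
    backend="gloo",
    init_method=f"file://{store_path}",
    rank=0,
    world_size=1,
)

# Collective primitive: sum a tensor across all ranks (a no-op at world size 1).
tensor = torch.ones(3)
dist.all_reduce(tensor, op=dist.ReduceOp.SUM)

dist.destroy_process_group()
```

With more than one rank, the same `all_reduce` call is what gradient synchronization in `torch.nn.parallel.DistributedDataParallel` builds on.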
System requirements
Establishing minimum software and hardware requirements is important to mitigate risks, such as security vulnerabilities, performance issues, or compatibility problems, and to ensure that the SOUP functions effectively within the intended environment.
Software
After evaluation, we find that there are no specific software requirements for this SOUP. It works properly on standard computing devices, including our environment.
Hardware
After evaluation, we find that there are no specific hardware requirements for this SOUP. It works properly on standard computing devices, including our environment.
Documentation
The official SOUP documentation can be found at https://pytorch.org/docs/stable/index.html
Additionally, a criterion for validating the SOUP is that all the items of the following checklist are satisfied:
- The vendor maintains clear and comprehensive documentation of the SOUP describing its functional capabilities, user guidelines, and tutorials, which facilitates learning and rapid adoption.
- The documentation for the SOUP is regularly updated and clearly outlines every feature utilized by the medical device, doing so for all integrated versions of the SOUP.
Related software items
We catalog the interconnections between the microservices within our software architecture and the specific versions of the SOUP they utilize. This mapping ensures clarity and traceability, facilitating both the understanding of the system's dependencies and the management of SOUP components.
Although the section title refers to software items, we map SOUP versions to microservices (which are themselves software items) because each microservice runs in its own Docker container and therefore has its own isolated runtime environment.
SOUP version | Software item(s) |
---|---|
1.7.0 | APASI-SEGMENTER, APASI-CLASSIFIER, ASALT, ASCORAD-SEGMENTER, ASCORAD-CLASSIFIER, ICD MULTICLASS CLASSIFIER, ICD BINARY CLASSIFIER, BINARY REFERRER, QUALITY VALIDATOR, DOMAIN VALIDATOR |
2.2.1 | AGPPGA, ALADIN, APULSI |
Related risks
The following risks, taken from the table in document R-TF-013-002 Risk management record_2023_001, apply to this SOUP:
- 58. SOUP presents an anomaly that makes it incompatible with other SOUPs or with software elements of the device.
- 59. SOUP is not being maintained nor regularly patched.
- 60. SOUP presents cybersecurity vulnerabilities.
Lists of published anomalies
The incidents, anomalies, known issues, and changes between versions for this SOUP can be found at:
- PyTorch Release Notes
- PyTorch NVIDIA Container Image Release Notes
- PyTorch Community Channels
- PyTorch Slack
- PyTorch Issues
History of evaluation of SOUP anomalies
29 Feb 2024
- Reviewer of the anomalies: Alejandro Carmena Magro
- Version(s) of the SOUP reviewed: 1.7.0, 2.2.1
No anomalies have been found.
Record signature meaning
- Author: JD-004
- Reviewer: JD-003
- Approver: JD-005