---
title: README
emoji: 🌍
colorFrom: gray
colorTo: purple
sdk: static
pinned: false
---
<!-- header start -->
<a href="https://www.pruna.ai/" target="_blank" rel="noopener noreferrer">
    <img src="https://github.com/PrunaAI/pruna/raw/main/docs/assets/images/logo.png" 
         alt="PrunaAI" 
         style="width: 50%; min-width: 400px; display: block; margin: 0;">
</a>
<!-- header end -->

----

# 🌍 Join the Pruna AI community!
[![Twitter](https://img.shields.io/twitter/follow/PrunaAI?style=social)](https://twitter.com/PrunaAI)
[![GitHub](https://img.shields.io/github/stars/prunaai/pruna)](https://github.com/PrunaAI/pruna)
[![LinkedIn](https://img.shields.io/badge/LinkedIn-Connect-blue)](https://www.linkedin.com/company/93832878/admin/feed/posts/?feedType=following)
[![Discord](https://img.shields.io/badge/Discord-Join%20Us-blue?style=social&logo=discord)](https://discord.com/invite/JFQmtFKCjd)
[![Reddit](https://img.shields.io/reddit/subreddit-subscribers/PrunaAI?style=social)](https://www.reddit.com/r/PrunaAI/)

----

# πŸ’œ Simply make AI models faster, cheaper, smaller, greener!
[Pruna AI](https://www.pruna.ai/) makes AI models faster, cheaper, smaller, greener with the `pruna` package.
- It supports **various model types, including CV, NLP, audio, and graph models, for both predictive and generative AI**.
- It supports **various hardware, including GPU, CPU, and edge devices**.
- It supports **various compression algorithms, including quantization, pruning, distillation, caching, recovery, and compilation**, which can be **combined together**.
- You can either **experiment on your own** with smash/compression configurations or **let the smashing/compressing agent** find the optimal configuration **[Pro]**.
- You can **evaluate reliable quality and efficiency metrics** of your base vs. smashed/compressed models.

You can set it up in minutes and compress your first models in a few lines of code!

----

# ⏩ How to get started?
You can smash your own models by installing `pruna` with pip:
```bash
pip install pruna
```
or directly [from source](https://github.com/PrunaAI/pruna).
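
Once installed, the typical workflow is to load a model, describe the desired compression in a `SmashConfig`, and call `smash`. Below is a minimal sketch assuming the `SmashConfig`/`smash` interface described in the Pruna docs; the `"cacher"` setting, the `"deepcache"` algorithm name, and the example model are illustrative and may differ by version, so check the [documentation](https://docs.pruna.ai/) for your release.

```python
# Sketch of the pruna smash workflow. API names (SmashConfig, smash) follow the
# Pruna documentation; the model and the "deepcache" cacher are example choices.
from diffusers import StableDiffusionPipeline
from pruna import SmashConfig, smash

# Load any supported base model (a diffusers pipeline is used as an example).
pipe = StableDiffusionPipeline.from_pretrained("runwayml/stable-diffusion-v1-5")

# Describe the compression you want; multiple algorithms can be combined.
smash_config = SmashConfig()
smash_config["cacher"] = "deepcache"

# Compress ("smash") the model, then use it like the original.
smashed_pipe = smash(model=pipe, smash_config=smash_config)
image = smashed_pipe("a photo of an astronaut riding a horse").images[0]
```

The smashed model keeps the original's interface, so existing inference code needs no changes beyond the `smash` call.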

You can start with these simple notebooks to experience the efficiency gains first-hand:

| Use Case | Free Notebooks |
|------------------------------------------------------------|----------------------------------------------------------------|
| **3x Faster Stable Diffusion Models** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/sd_deepcache.ipynb) |
| **Making your LLMs 4x smaller** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/llms.ipynb) |
| **Smash your model with a CPU only** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/cv_cpu.ipynb) |
| **Transcribe 2 hours of audio in less than 2 minutes with Whisper** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_tutorial.ipynb) |
| **100% faster Whisper transcription** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/asr_whisper.ipynb) |
| **Run your Flux model without an A100** | ⏩ [Smash for free](https://githubtocolab.com/PrunaAI/pruna/blob/1d68f74c132bd4045f2af55bb1e5c03bf2dde6a9/docs/tutorials/flux_small.ipynb) |
| **x2 smaller Sana in action** | ⏩ [Smash for free](https://colab.research.google.com/github/PrunaAI/pruna/blob/main/docs/tutorials/sana_diffusers_int8.ipynb) |

For more details on installation, free tutorials, and Pruna Pro tutorials, check the [Pruna AI documentation](https://docs.pruna.ai/).

----