E2E OpenVino Phi3 Vision
Export
[14]
[15]
[16]
[ ]
⌛ Phi-3-vision conversion started. Be patient, it may take some time. ⌛
Load Original model
Loading checkpoint shards: 0%| | 0/2 [00:00<?, ?it/s]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
✅ Original model successfully loaded
⌛ Convert Input embedding model
✅ Input embedding model successfully converted
⌛ Convert Image embedding model
C:\Users\kinfeylo\AppData\Roaming\Python\Python311\site-packages\transformers\modeling_utils.py:4481: FutureWarning: `_is_quantized_training_enabled` is going to be deprecated in transformers 4.39.0. Please use `model.hf_quantizer.is_trainable` instead
  warnings.warn(
C:\Users\kinfeylo\AppData\Roaming\Python\Python311\site-packages\transformers\models\clip\modeling_clip.py:276: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_weights.size() != (bsz * self.num_heads, tgt_len, src_len):
C:\Users\kinfeylo\AppData\Roaming\Python\Python311\site-packages\transformers\models\clip\modeling_clip.py:316: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_output.size() != (bsz * self.num_heads, tgt_len, self.head_dim):
✅ Image embedding model successfully converted
⌛ Convert Image projection model
You are not running the flash-attention implementation, expect numerical differences.
✅ Image projection model successfully converted
⌛ Convert Language model
C:\Users\kinfeylo\AppData\Roaming\Python\Python311\site-packages\transformers\modeling_attn_mask_utils.py:114: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if (input_shape[-1] > 1 or self.sliding_window is not None) and self.is_causal:
C:\Users\kinfeylo\AppData\Roaming\Python\Python311\site-packages\transformers\modeling_attn_mask_utils.py:162: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if past_key_values_length > 0:
C:\Users\kinfeylo\.cache\huggingface\modules\transformers_modules\microsoft\Phi-3-vision-128k-instruct\6065b7a1a412feff7ac023149f65358b71334984\modeling_phi3_v.py:143: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if seq_len > self.original_max_position_embeddings:
c:\Users\kinfeylo\.conda\envs\openvinodev\Lib\site-packages\nncf\torch\dynamic_graph\wrappers.py:86: TracerWarning: torch.tensor results are registered as constants in the trace. You can safely ignore this warning if you use this function to create tensors out of constant variables that would be the same every time you call this function. In any other case, this might cause the trace to be incorrect.
  op1 = operator(*args, **kwargs)
C:\Users\kinfeylo\.cache\huggingface\modules\transformers_modules\microsoft\Phi-3-vision-128k-instruct\6065b7a1a412feff7ac023149f65358b71334984\modeling_phi3_v.py:381: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_weights.size() != (bsz, self.num_heads, q_len, kv_seq_len):
C:\Users\kinfeylo\.cache\huggingface\modules\transformers_modules\microsoft\Phi-3-vision-128k-instruct\6065b7a1a412feff7ac023149f65358b71334984\modeling_phi3_v.py:388: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attention_mask.size() != (bsz, 1, q_len, kv_seq_len):
C:\Users\kinfeylo\.cache\huggingface\modules\transformers_modules\microsoft\Phi-3-vision-128k-instruct\6065b7a1a412feff7ac023149f65358b71334984\modeling_phi3_v.py:400: TracerWarning: Converting a tensor to a Python boolean might cause the trace to be incorrect. We can't record the data flow of Python values, so this value will be treated as a constant in the future. This means that the trace might not generalize to other inputs!
  if attn_output.size() != (bsz, self.num_heads, q_len, self.head_dim):
c:\Users\kinfeylo\.conda\envs\openvinodev\Lib\site-packages\torch\jit\_trace.py:168: UserWarning: The .grad attribute of a Tensor that is not a leaf Tensor is being accessed. Its .grad attribute won't be populated during autograd.backward(). If you indeed want the .grad field to be populated for a non-leaf Tensor, use .retain_grad() on the non-leaf Tensor. If you access the non-leaf Tensor by mistake, make sure you access the leaf Tensor instead. See github.com/pytorch/pytorch/pull/30531 for more informations. (Triggered internally at C:\actions-runner\_work\pytorch\pytorch\builder\windows\pytorch\build\aten\src\ATen/core/TensorBody.h:494.)
  if a.grad is not None:
✅ Language model successfully converted
⌛ Weights compression with int4_sym mode started
INFO:nncf:Statistics of the bitwidth distribution:
+----------------+-----------------------------+----------------------------------------+
| Num bits (N)   | % all parameters (layers)   | % ratio-defining parameters (layers)   |
+================+=============================+========================================+
| 8              | 42% (54 / 129)              | 40% (53 / 128)                         |
+----------------+-----------------------------+----------------------------------------+
| 4              | 58% (75 / 129)              | 60% (75 / 128)                         |
+----------------+-----------------------------+----------------------------------------+
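As a quick sanity check, the percentages in the "% all parameters" column follow directly from the layer counts NNCF reports (a small sketch; the ratio-defining column is computed over 128 layers with a different weighting and is not re-derived here):

```python
# Layer counts reported by NNCF for the int4_sym compression run
total_layers = 129
int8_layers = 54   # layers kept at 8-bit precision
int4_layers = 75   # layers compressed to 4-bit

pct_int8 = round(100 * int8_layers / total_layers)
pct_int4 = round(100 * int4_layers / total_layers)

print(f"8-bit: {pct_int8}% ({int8_layers} / {total_layers})")  # 42%
print(f"4-bit: {pct_int4}% ({int4_layers} / {total_layers})")  # 58%
```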
✅ Weights compression finished
✅ Phi-3-vision model conversion finished. You can find results in model\phi3-vision\int4
[18]
Dropdown(description='Device:', index=1, options=('CPU', 'GPU', 'AUTO'), value='GPU')
[19]
[20]
WindowsPath('model/phi3-vision/int4')
[21]
[22]
[23]
[24]
[25]
Special tokens have been added in the vocabulary, make sure the associated word embeddings are fine-tuned or trained.
Coding:
To create a pie chart representing marks obtained in an exam, we can use the matplotlib library in Python. Here is a step-by-step guide on how to do it:
1. First, import the necessary libraries. We need matplotlib for creating the pie chart and pandas for handling the data.
```python
import matplotlib.pyplot as plt
import pandas as pd
```
2. Next, we need to create a DataFrame from the given data.
```python
data = {'subject': ['maths', 'physics', 'chemistry', 'english'], 'marks': [340, 300, 2000, 150]}
df = pd.DataFrame(data)
```
3. Now, we can create the pie chart. We use the 'autopct' parameter, which takes a format string such as '%1.1f%%', to display the percentage of each slice.
```python
plt.pie(df['marks'], labels=df['subject'], autopct='%1.1f%%', colors=['red', 'blue', 'yellow', 'green'])
```
4. Finally, we can display the chart and save it as a jpg file.
```python
plt.title('Marks obtained in an exam')
plt.savefig('imgs/im-3-3-3.jpg')
plt.show()
```
This code will create a pie chart representing the marks obtained in an exam. The chart will be saved as 'im-3-3-3.jpg'.
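The four steps above can be combined into one runnable script. This is a sketch: it assumes matplotlib and pandas are installed, uses the non-interactive Agg backend so it also runs without a display, and writes a PNG to the working directory (JPEG output additionally requires Pillow):

```python
import matplotlib
matplotlib.use("Agg")          # non-interactive backend; works without a display
import matplotlib.pyplot as plt
import pandas as pd

# Marks data from the example above
data = {'subject': ['maths', 'physics', 'chemistry', 'english'],
        'marks': [340, 300, 2000, 150]}
df = pd.DataFrame(data)

# autopct formats each slice's percentage label; one color per subject
plt.pie(df['marks'], labels=df['subject'], autopct='%1.1f%%',
        colors=['red', 'blue', 'yellow', 'green'])
plt.title('Marks obtained in an exam')
plt.savefig('marks_pie.png')   # PNG avoids the optional Pillow dependency for JPEG
plt.close()
```

Saving before closing the figure ensures the rendered chart reaches disk; with an interactive backend you would call plt.show() instead of plt.close().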