PeterJohnPyTorch
iPhone / developers
Now you can use "PyTorch" on your iPhone, even when you are on the train
or when your iPhone is offline.
****Why do we need "PyTorch" on an edge device such as the iPhone?;
When "Torch" is a lamp, the "iPhone" becomes a lamp stand,
even when you are on the train or when your iPhone is offline.
So do not put "Torch" under a basket (a hidden place) but on your "iPhone",
so that the light of the lamp will shine before others.
Scripture (Matthew 5:13-16) says
that when the light is put on the hill,
the city on the hill cannot be hidden.
****Matthew 5:13-16, ESV;
5:13 “You are the salt of the earth, but if salt has lost its taste,
how shall its saltiness be restored?
5:14 “You are the light of the world. A city set on a hill cannot be hidden.
5:15 Nor do people light a lamp and put it under a basket, but on a stand,
and it gives light to all in the house.
5:16 In the same way, let your light shine before others,
so that they may see your good works
and give glory to your Father who is in heaven.
****TutorialSeason007;
We prepared some examples
to show you what you can do using PeterJohnPyTorch.
s001QuestAnswerLibTorch.py;
This is not PyTorch but just LibTorch:
a question-answering demo with LibTorch.
You cannot customize almost anything,
because LibTorch is called via Swift,
and Swift needs to be compiled
with Xcode on macOS.
This example loads the model from the "qa360_quantized.ptl" file,
which contains TorchScript.
s002ProvisionPyTorchTensor.py;
This shows that PeterJohnPyTorch can use tensors.
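As a taste of what s002 covers, here is a minimal tensor sketch in standard PyTorch (the exact contents of the tutorial script may differ):

```python
import torch

# Create tensors and do basic arithmetic on them.
a = torch.tensor([[1.0, 2.0], [3.0, 4.0]])
b = torch.ones(2, 2)

c = a + b          # element-wise addition
d = a @ b          # matrix multiplication

print(c)           # tensor([[2., 3.], [4., 5.]])
print(d.shape)     # torch.Size([2, 2])
```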
s003ProvisionPyTorchAutoGradFoundation.py;
This shows that PeterJohnPyTorch can use autograd (the foundations).
s004ProvisionPyTorchAutoGrad.py;
This shows that PeterJohnPyTorch can use autograd.
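The kind of autograd usage these two tutorials demonstrate boils down to something like this (a minimal sketch; the tutorial scripts themselves may differ):

```python
import torch

# A tensor with requires_grad=True tracks operations for autograd.
x = torch.tensor(3.0, requires_grad=True)
y = x ** 2 + 2 * x    # y = x^2 + 2x

y.backward()          # compute dy/dx
print(x.grad)         # dy/dx = 2x + 2 = 8 at x = 3
```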
s005ProvisionPyTorchNN.py;
This shows that PeterJohnPyTorch can use torch.nn (neural networks).
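A tiny torch.nn model of the kind s005 exercises might look like this (a sketch; the tutorial's actual network may differ):

```python
import torch
from torch import nn

# A small feed-forward network built from torch.nn modules.
model = nn.Sequential(
    nn.Linear(4, 8),
    nn.ReLU(),
    nn.Linear(8, 2),
)

x = torch.randn(1, 4)     # one sample with 4 features
out = model(x)
print(out.shape)          # torch.Size([1, 2])
```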
s006ProvisionPyTorchNNoptimizer.py;
This shows that PeterJohnPyTorch can use torch.nn (neural networks)
together with an optimizer.
It uses SGD (stochastic gradient descent) as the optimizer.
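A single SGD training step, in the style s006 demonstrates, looks roughly like this (a sketch with a toy model and random data, not the tutorial's exact code):

```python
import torch
from torch import nn

# One gradient-descent step with torch.optim.SGD on a tiny model.
model = nn.Linear(3, 1)
optimizer = torch.optim.SGD(model.parameters(), lr=0.1)
loss_fn = nn.MSELoss()

x = torch.randn(8, 3)          # toy inputs
y = torch.randn(8, 1)          # toy targets

optimizer.zero_grad()          # clear old gradients
loss = loss_fn(model(x), y)    # forward pass
loss.backward()                # compute gradients
optimizer.step()               # update parameters
print(loss.item())
```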
s007QuestAnswerPyTorch.py;
Now you can see not a LibTorch demo but a PyTorch demo
of question answering.
You can customize, for example, the tokenizer
in "pjQuestionAnswering.py",
using PyTorch on the iPhone.
This example loads the model from the "qa360_quantized.ptl" file,
which contains TorchScript.
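For context, a TorchScript model like "qa360_quantized.ptl" is created on a desktop machine and then loaded at runtime. The sketch below uses a toy module in place of the real QA model, and the regular ".pt" save/load path; a ".ptl" lite-interpreter file is produced from the same scripted module with `_save_for_lite_interpreter`:

```python
import torch
from torch import nn

# A toy module standing in for the real question-answering model.
class Toy(nn.Module):
    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * 2

scripted = torch.jit.script(Toy())          # convert to TorchScript
scripted.save("toy_scripted.pt")            # serialize to disk

loaded = torch.jit.load("toy_scripted.pt")  # load it back
print(loaded(torch.tensor([1.0, 2.0])))     # tensor([2., 4.])
```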
s008QuestAnswerTransformers.py;
A question-answering demo with PyTorch and Transformers.
After you tap "Run Script",
this example begins to download "model.safetensors" (265.5 MB).
We recommend that you make a copy of "model.safetensors"
outside of this app, "PeterJohnPyTorch",
using Apple's "Files" app,
so that even if you uninstall this app,
after you install "PeterJohnPyTorch" again
you can put the "model.safetensors" file back into the "/images" directory
of this app.
This example loads the model from the "model.safetensors" file,
which is the standard format of Hugging Face.
****Some restrictions that we currently know of;
1) Cannot trace models using the torch.jit.trace() function;
since "PeterJohnPyTorch" cannot create a traced model,
it also cannot create a ".ptl" ("PyTorch Lite" format) file
or a ".pt" ("PyTorch" format) file right now.
2) Cannot script models using the torch.jit.script() function;
since "PeterJohnPyTorch" cannot create a scripted model,
it also cannot create a ".ptl" ("PyTorch Lite" format) file
or a ".pt" ("PyTorch" format) file right now.
For now, you need to
save the model as "model.safetensors"
and use the model via PyTorch or via Transformers,
that is, write the logic via PyTorch or via Transformers.
3) The "MPS" (Metal Performance Shaders) backend is not implemented yet;
so for now you need to specify "cpu" as the device.
The "CPU" backend is just the first step before the "GPU" backend.
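In practice, pinning everything to the CPU looks like this (a minimal sketch):

```python
import torch

# Until the MPS backend is implemented, place model and data on the CPU.
device = torch.device("cpu")

model = torch.nn.Linear(4, 2).to(device)
x = torch.randn(1, 4, device=device)

out = model(x)
print(out.device)   # cpu
```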
****
Enjoy PeterJohnPyTorch even when you are on the train
or your iPhone is offline.
--Yasushi Yassun Obata
What's new in the latest version?
'OnnxStream' Stable Diffusion XL Turbo version;
sorry, this is not 'PyTorch' Stable Diffusion but 'OnnxStream' Stable Diffusion,
because we wanted to run Stable Diffusion on iPhones that have less than 6 GB of RAM,
such as the iPhone 6s Plus (2 GB RAM) and the iPhone XR (3 GB RAM).
****Requirements for the model;
More than 19 GB of disk space.
We recommend
more than 38 GB (19 GB × 2) of disk space,
so that you can back up the models from this app's model folder
to 'On My iPhone' using Apple's 'Files' app,
and restore the models from 'On My iPhone'
to this app's model folder using Apple's 'Files' app.
If you don't back up the models on the iPhone,
you need only 19 GB.
****What we prepared this time;
TutorialSeason008/s001StableDiffusion.py;
This is a Python binding of 'OnnxStream' Stable Diffusion.
It also provides a function to download the models,
because although 'OnnxStream' Stable Diffusion downloads the models
by executing the 'curl' command via the 'system' command,
the iPhone can execute neither the 'system' command nor the 'curl' command.
****iPhone 6s Plus test results;
Diffusion takes almost 80 seconds,
decoding takes almost 160 seconds,
a total of almost 240 seconds (4 minutes).
****iPhone XR test results;
Diffusion takes almost 40 seconds,
decoding takes almost 100 seconds,
a total of almost 140 seconds (2 minutes 20 seconds).
There is still room for improvement in the speed and quality of 'txt2img'.
But even if progress is made in the area of 'txt2img',
humans cannot invent anything new;
humans can only discover what the glorious Father prepared and allowed humans to discover.
****1st John 5:20-21, ESV;
5:20 And we know that the Son of God has come and has given us understanding,
so that we may know him who is true; and we are in him who is true, in his Son Jesus Christ.
He is the true God and eternal life.
5:21 Little children, keep yourselves from idols.
--Yasushi Yassun Obata