Hello again,
I’m wondering about all the workarounds to optimize the scripts and reduce their size and resource utilization.
I’m working on a big project and I’m struggling with these runtime limitations.
I would love to be able to code the logic of any big idea or project, and I believe this must be possible with Cardano.
I’m not a Haskell expert, so I might be making some beginner mistakes handling data types or code structures.
I read some articles:
1 -
# Optimizations to reduce CPU and Mem consumption
In order to have an insight of which parts of the plutus script that are responsible for significant memory and cpu consumption, it is recommended to use the profiling tool made available in the plutus repository. Note that the profiling tool requires compiling plutus scripts to insert profiling instructions necessary for assessing performance. The profiling tool also requires the plutus script to be fully applied, which means that all the arguments to the plutus script (i.e. datum, redeemer and script context) shall also be produced. The profiling documentation can be found at [Profiling Scripts](https://plutus.readthedocs.io/en/latest/plutus/howtos/profiling-scripts.html).
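(My note: as far as I understand the linked profiling guide, the profiling instructions are inserted at compile time via a plugin option along these lines; the flag name is taken from those docs, so please double-check it against your Plutus version.)

```haskell
-- Enable insertion of profiling instructions for all functions compiled in
-- this module (option name as given in the Plutus profiling guide):
{-# OPTIONS_GHC -fplugin-opt PlutusTx.Plugin:profile-all #-}
```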
The various workarounds and optimizations that can help in reducing the plutus script size as well as the execution steps and memory execution units are described in the subsequent sections.
## Avoiding higher-order functions and closures
The use of higher-order functions is a common programming paradigm to facilitate code reuse. Higher-order functions are widely used in the plutus library but may have a significant impact on cpu and memory consumption especially when functions passed as arguments contain closures. It is therefore recommended to rewrite specialized versions to avoid closures as far as possible. For instance, the plutus function `findOwnInput` makes use of the higher order function `find` to search for the current script input.
```haskell
findOwnInput :: ScriptContext -> Maybe TxInInfo
findOwnInput ScriptContext{scriptContextTxInfo=TxInfo{txInfoInputs},
             scriptContextPurpose=Spending txOutRef} =
    find (\TxInInfo{txInInfoOutRef} -> txInInfoOutRef == txOutRef) txInfoInputs
findOwnInput _ = Nothing
```
As can be seen, the reference to `txOutRef` within the body of the function passed as argument to `find` introduces a closure. This can increase cpu and memory consumption, especially when the list `txInfoInputs` contains several elements. If only the `TxOut` script input is required, `findOwnInput` can be rewritten as follows to avoid closures and to save on `Maybe` constructs.
(Quote truncated; the full article is at https://plutus.readthedocs.io/en/latest/reference/writing-scripts/optimization.html?highlight=profiling)
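To make sure I understand the closure point, here is a simplified, self-contained sketch of the pattern they seem to recommend: instead of passing `find` a lambda that captures the target from the enclosing scope, write a specialized recursive helper that takes the target as an explicit argument. The names here are mine for illustration, not from the Plutus API.

```haskell
import Data.List (find)

-- Generic version: the lambda passed to `find` captures `target` from the
-- enclosing scope, i.e. it is a closure.
findGeneric :: Eq k => k -> [(k, v)] -> Maybe (k, v)
findGeneric target = find (\(k, _) -> k == target)

-- Specialized version: `target` is threaded as a plain argument of the
-- recursive helper, so no closure has to be carried through the traversal.
findSpecialized :: Eq k => k -> [(k, v)] -> Maybe (k, v)
findSpecialized _      []               = Nothing
findSpecialized target (p@(k, _) : rest)
  | k == target = Just p
  | otherwise   = findSpecialized target rest
```

If I read the article right, `findOwnInput` specialized this way would take `txOutRef` as an explicit argument and recurse over `txInfoInputs` directly.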
They mention:
- Avoiding higher-order functions and closures.
- Adding strictness on accumulators in recursive functions.
- Common subexpression elimination.
- Using `error` for faster failure.
- Avoiding monadic `do` notation for handling pattern-match failure (I can’t clearly understand this one).
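Here is how I currently picture two of these items as plain-Haskell sketches (the actual Plutus compilation details may differ, so treat this as my understanding, not gospel). For the last point: a failable pattern bind inside `do` desugars to a call to the monad’s failure handler (`MonadFail`), which I gather drags extra code into the compiled script, whereas an explicit `case` keeps the failure branch direct.

```haskell
{-# LANGUAGE BangPatterns #-}

-- Strict accumulator: the bang forces evaluation at each step, so the
-- accumulator does not build up a chain of unevaluated thunks.
sumStrict :: [Integer] -> Integer
sumStrict = go 0
  where
    go !acc []       = acc
    go !acc (x : xs) = go (acc + x) xs

-- Do-notation with a failable pattern: if the pattern (1, y) does not
-- match, the desugaring calls MonadFail's `fail` behind the scenes.
viaDo :: Maybe (Integer, Integer) -> Maybe Integer
viaDo m = do
  (1, y) <- m
  pure y

-- Explicit case: the failure path is a plain, visible branch.
viaCase :: Maybe (Integer, Integer) -> Maybe Integer
viaCase m = case m of
  Just (1, y) -> Just y
  _           -> Nothing
```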
2 - GitHub - Plutonomicon/plutarch-plutus: Typed eDSL for writing UPLC /ˈpluː.tɑːk/
I understand that this is about rewriting everything in this language, which is lower-level and more efficient than the Plutus libraries.
3 - contracts/dex/src/Minswap at main · CatspersCoffee/contracts · GitHub
I saw in the Minswap code that they transform the Plutus script with Plutonomy (GitHub - well-typed/plutonomy: An optimizer for untyped plutus core):
```haskell
import qualified Plutonomy

{-# INLINEABLE mkFactoryPolicy #-}
mkFactoryPolicy :: MintingPolicy
mkFactoryPolicy =
  Plutonomy.optimizeUPLC $
    Plutonomy.mintingPolicyToPlutus originalFactoryPolicy
```
But I’m not sure about the final code. How can we be sure that the code is still reliable after it is passed through `Plutonomy.optimizeUPLC`? Also, this is only compatible with specific older versions of Plutus:
plutus-1efbb276e
plutus-4710dff2e
plutus-f680ac697 (cardano-node-1.35.1)
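The only way I can think of to gain confidence in the optimizer’s output is differential testing: evaluate the original and the optimized script on the same batch of inputs (datums, redeemers, contexts) and check they agree. Here is a toy version of that idea in plain Haskell, nothing Plutus-specific:

```haskell
-- Toy differential test: check that two implementations agree on every
-- input in a list. For real scripts the inputs would be (datum, redeemer,
-- context) triples and the functions would be the two evaluated scripts.
agreeOn :: Eq b => (a -> b) -> (a -> b) -> [a] -> Bool
agreeOn f g = all (\x -> f x == g x)
```

I’d be curious whether people run something like this (e.g. with QuickCheck) against Plutonomy’s output in practice.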
I would appreciate any help in this matter.
If you can share your experience coding big projects and the solutions you found, that will mean a lot to me.
Thank you so much.
Regards, Manu