BigBeluga - Smart Money Concepts
Smart Money Concepts (SMC) is a comprehensive toolkit built around the principles of "smart money" behavior, which refers to the actions and strategies of institutional investors.
SMC transcends traditional technical analysis by delving deeper into this framework. This approach allows users to decipher the actions of these influential players, anticipate their potential impact on market dynamics, and gain insights beyond just price movements.
This all-in-one toolkit provides the user with a unique experience by automating most of the basic and advanced concepts on the chart, saving them time and improving their trading ideas.
🔹Real-time market structure analysis simplifies complex trends by pinpointing key support, resistance, and breakout levels.
🔹Advanced order block analysis leverages detailed volume data to pinpoint high-demand zones, revealing internal market sentiment and predicting potential reversals. This analysis utilizes bid/ask zones to provide supply/demand insights, empowering informed trading decisions.
🔹Imbalance Concepts (FVG and Breakers) allow traders to identify potential market weaknesses and areas where price might be attracted to fill the gap, creating opportunities for entry and exit.
🔹Swing failure patterns help traders identify potential entry points and rejection zones based on price swings
🔹Liquidity Concepts, our advanced liquidity algorithm, pinpoints high-impact events, allowing you to predict market shifts, strong price reactions, and potential stop-loss hunting zones. This gives traders an edge to make informed trading decisions based on multi-timeframe liquidity dynamics.
🔶 FEATURES
The indicator has quite a lot of features that are provided below:
Swing market structure
Internal market structure
Mapping structure
Discount/Premium zone
Adjustable market structure
Strong/Weak H&L
Sweep
Volumetric Order block / Breakers
Fair Value Gaps / Breakers (multi-timeframe)
Swing Failure Patterns (multi-timeframe)
Deviation area
Equal H&L
Liquidity Prints
Buyside & Sellside
Sweep Area
Highs and Lows (multi-timeframe)
🔶 BASIC DEMONSTRATION
The preceding image illustrates the market structure functionality within the Smart Money Concepts indicator.
Solid lines: These represent the core indicator's internal structure, forming the foundation for most other components. They visually depict the overall market direction and identify major reversal points marked by significant price movements (denoted as 'x').
Dotted lines: These represent an alternative internal structure with the potential to drive more rapid market shifts. This is particularly relevant when a significant gap exists in the established swing structure, specifically between the Break of Structure (BOS) and the most recent Change of Character (CHoCH). Identifying these formations can offer opportunities for quicker entries and potential short-term reversals.
Sweeps (x): These signify potential turning points in the market where liquidity is removed from the structure. This suggests a possible trend reversal and presents crucial entry opportunities. Sweeps are identified within both swing and internal structures, providing valuable insights for informed trading decisions.
🔶 USAGE & EXAMPLES
The image above showcases a detailed example of several features from our toolkit that can be used in conjunction for a comprehensive analysis.
Price rejecting from the bullish order block (POC), while printing inside a bullish SFP and internal structure turning bullish (Internal CHoCH).
The image further demonstrates how two bearish order blocks could potentially act as resistance zones when prices approach those levels. These areas might also offer attractive locations to place take-profit orders.
The price has reached our first take-profit level, but is exhibiting some signs of weakness, suggesting a potential pullback which could put the trade at higher risk.
On the other hand, the price action currently exhibits strong bullish sentiment, suggesting favorable entry points and a potential upward trend.
The price has now fully reached our take-profit zone and is also exhibiting bearish confluence, indicating a potential price reversal or trend shift.
🔶 USING CONFLUENCE
The core principle behind the success of this toolkit lies in identifying "confluence." This refers to the convergence of multiple trading indicators all signaling the same information at a specific point or area. By seeking such alignment, traders can significantly enhance the likelihood of successful trades.
In the image above we can see a few examples of the indicator used in confluence with other metrics included in the toolkit.
Liquidity Prints within order blocks
SFP close to the POC
Sweep in liquidity close to a fair value gaps
These are just a few examples of what applying confluence can look like.
🔶 SETTINGS
Window: limit calculation period
Swing: limit drawing function
Internal: the period from which the internal structure begins
Mapping structure: show structural points
Algorithmic Logic: (Extreme-Adjusted) Use max high/low or pivot point calculation
Algorithmic lookback: pivot point lookback
Premium / Discount: Lookback period of the pivot point calculation
Show Last: Number of order blocks to display
Hide Overlap: hide overlapping order blocks
Construction: Size of the order blocks
Fair value gaps: Choose between normal FVG or Breaker FVG
Mitigation: (close / wick / avg) point used to mitigate the order block/imbalance
SFP lookback: find a higher / lower point to improve accuracy
Threshold: remove less relevant SFP
Equal H&L: (short-mid-long term) display longer term
Any Alert(): Trigger alerts based on the selected inputs
"algo"に関するスクリプトを検索
Fractal Consolidations [Pro+]
Introduction:
Fractal Consolidations Pro+ pushes the boundaries of Algorithmic Price Delivery Analysis. Tailored for traders seeking precision and efficiency to unlock hidden insights, this tool empowers you to dissect market Consolidations on your terms, live, in all asset classes.
What is a Fractal Consolidation?
Consolidations occur when price is trading in a range. Normally, Consolidation scripts use a static number of "lookback candles", checking whether price is continuously trading inside the highest and lowest price points of said Time window.
After years spent studying price action and numerous programming attempts, this tool succeeds in veering away from the lookback candle approach. This Consolidation script harnesses the delivery mechanisms and Time principles of the Interbank Price Delivery Algorithm (IPDA) to define Fractal Consolidations – solely based on a Timeframe Input used for context.
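For contrast, here is a minimal Pine Script sketch of that conventional lookback-candle approach (the inputs and the bars-inside count are illustrative assumptions, not part of this tool): price is treated as consolidating while it keeps trading inside the extremes of the prior window.

//@version=5
indicator("Lookback consolidation sketch", overlay = true)

lookback = input.int(20, "Lookback candles")            // hypothetical input
minBars  = input.int(5,  "Bars required inside range")  // hypothetical input

// Range defined by the highest high / lowest low of the prior lookback window
rangeHigh = ta.highest(high, lookback)[1]
rangeLow  = ta.lowest(low,  lookback)[1]

// A bar counts as "inside" when it trades fully within the prior window's extremes
insideBar = high <= nz(rangeHigh, high) and low >= nz(rangeLow, low)

// Count consecutive inside bars; flag a consolidation once enough accumulate
var int insideCount = 0
insideCount := insideBar ? insideCount + 1 : 0
consolidating = insideCount >= minBars

bgcolor(consolidating ? color.new(color.blue, 85) : na)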
Description:
This concept was engineered around price delivery principles taught by the Inner Circle Trader (ICT). As per ICT, it's integral for an Analyst to understand the four phases of price delivery: Consolidation, Expansion, Retracement, and Reversal.
According to ICT, any market movement originates from a Consolidation, followed by an Expansion.
When Consolidation ranges begin to break and resting liquidity is available, cleaner Expansions will take place. This tool's value is to visually aid Analysts and save Time in finding Consolidations in live market conditions, to take advantage of Expansion moves.
CME_MINI:ES1! 15-Minute Consolidation setting up an Expansion move, on the 10 Minute Chart:
Fractal Consolidations Pro+ doesn't only assist in confirming Higher Timeframe trend continuations and exposing opportunities on Lower Timeframes. It's also designed for both advanced traders and new traders to save Time and energy in navigating choppy or rangebound environments.
CME_MINI:ES1! 30 Minute Consolidation forming Live, on the 5 Minute Chart:
By analyzing past price action, traders will find algorithmic signatures when Consolidations are taking place, therefore providing a clearer view of where and when price is likely to contract, continue consolidating, breakout, retrace, or reverse. A prominent signature to consider when using this script is ICT's Market Maker Buy/Sell Models. These signatures revolve around the engineering of Consolidations to manipulate price in a specific direction, to then reverse at the appropriate Time. Each stage of the Market Maker Model can be identified and taken advantage of using Fractal Consolidations.
CME_MINI:NQ1! shift of the Delivery Curve from a Sell Program to a Buy Program, Market Maker Buy Model
Key Features:
Tailored Timeframes: choose the Timeframe that suits your model. Whether you're a short-term enthusiast eyeing 1 Hour Consolidations or a long-term trend follower analyzing 4 Hour Consolidations, this tool gives you the freedom to choose.
FOREXCOM:EURUSD Fractal Consolidations on a 15 Minute Chart:
Auto-Timeframe Convenience: for those who prefer a more dynamic and adaptive approach, our Auto Timeframe feature effortlessly adjusts to the most relevant Timeframe, ensuring you stay on top of market consolidations without manually adjusting settings.
Consolidation Types: define consolidations as contractions of price based on either its wick range or its body range.
COMEX:GC1! 4 Hour Consolidation differences between Wick-based and Body-based on a 1 Hour Chart:
Filtering Methods: combine previous overlapping Consolidations, merging them into one uniform Consolidation. This feature is subject to repainting only while a larger Consolidation is forming, as smaller Consolidations are confirmed. However, once established, the larger Consolidation will not repaint.
FOREXCOM:GBPUSD 15 Minute Consolidation Differences between Filter Consolidations ON and OFF:
IPDA Data Range Filtering: this feature gives the Analyst control for selective visibility of Consolidations in the IPDA Data Range Lookback. The Analyst can choose between 20, 40, and 60 days as per ICT teachings, or manually adjust through Override.
INDEX:BTCUSD IPDA40 Data Range vs. IPDA20 Data Range:
Extreme Float: this feature provides reference points when the price is outside the highest or lowest liquidity levels in the chosen IPDA Data Range Lookback. These Open Float Extremes offer critical insights when the market extends beyond the Lookback Consolidation Liquidity Levels. This feature helps identify liquidity extremes of interest that IPDA will consider, which is crucial for traders in understanding market movements beyond the IPDA Data Ranges.
INDEX:ETHUSD Extreme Float vs. Non-Extreme Float Liquidity:
IPDA Override: the Analyst can manually override the default settings of the IPDA Data Range Lookback, enabling more flexible and customized analysis of market data. This is particularly useful for focusing on recent price actions in Lower Timeframes (like viewing the last 3 days on a 1-minute timeframe) or for incorporating a broader data range in Higher Timeframes (like using 365 days to analyze Weekly Consolidations on a daily timeframe).
Liquidity Insight: gain a deeper understanding of market liquidity through customizable High Resistance Liquidity Run (HRLR) and Low Resistance Liquidity Run (LRLR) Consolidation colors. This feature helps distinguish between HRLR (high resistance, delayed price movement) and LRLR (low resistance, smooth price movement) Consolidations, aiding in quick assessment of market liquidity types.
TVC:DXY Low Resistance vs. High Resistance Consolidation Liquidity Behaviour and Narrative:
Liquidity Raid Type: decide whether to categorize a Consolidation liquidity raid by a wick or body trading through a level.
CBOT:ZB1! Wick vs. Body Liquidity Raid Type:
Customizable User Interface: tailor the visual representation to align with your preferences. Personalize your trading experience by adjusting the colors of consolidation liquidity (highs and lows) and equilibrium, as well as line styles.
DNA GRAVITY PRICE V1 PINESCRIPTLABS
We can observe that this indicator displays the range within which the asset fluctuates around the average price, and its behavior depends on the parameters of amplitude and angular frequency. "price_mas" is a measure calculated as part of the indicator. It is derived by adding an adjusted amplitude (A_mas) multiplied by the cosine of the combination of angular frequency (w), time, and a phase shift (phi) to the average price (P0). This calculated value oscillates around the actual asset price and is used to identify potential turning points and the range where the price has established itself within the specified lookback period.
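As a rough illustration of that description only (input names and defaults here are assumptions drawn from the text, not the published source), the calculation can be sketched in Pine Script as follows:

//@version=5
indicator("price_mas sketch", overlay = true)

length    = input.int(50,    "Initial Dynamic Length of MAS Price")
A_mas_pct = input.float(1.0, "MAS Amplitude Percentage") / 100.0
w         = input.float(0.5, "Angular Frequency of MAS")
phi       = input.float(0.0, "Phase shift")

// P0: the average price the oscillation is centered on
P0 = ta.sma(close, length)

// Amplitude expressed as a percentage of the average price
A_mas = P0 * A_mas_pct

// price_mas = P0 + A_mas * cos(w * t + phi), using bar_index as the time variable
price_mas = P0 + A_mas * math.cos(w * bar_index + phi)

plot(price_mas, "price_mas", color.blue)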
At its core, the indicator utilizes the innovative concept of 'price_mas,' a calculated metric visualized in three essential colors: green to indicate low levels, blue for medium levels, and red for high levels. These colors reflect the position of the price in relation to a range determined by historical highs and lows.
In the context of the "DNA GRAVITY PRICE V1 " indicator, low, medium, and high levels specifically refer to the calculated value of 'price_mas,' which is a derived measure within the indicator. They do not directly refer to the actual asset price but rather to a calculated value that the indicator uses to analyze and predict the behavior of the asset's price.
This algorithm stands out for its ability to capture the 'strength' of the price through the 'price_mas' zones. Once the price exits the zones marked by the 'price_mas' (red, blue, and green plots), it tends to return with significant force. This behavior is crucial for traders, as it provides opportunities both to capitalize on price retracements and to anticipate potential trend reversals.
Buy & Sell Signals:
Buy Signal: If the price and the Donchian lines cross above the high threshold, visually represented by red diamonds, it indicates a strong bullish momentum. This not only shows that the price is rising but also that the trend is strong enough to push the Donchian lines, which represent price extremes over a certain period, above the threshold. This convergence of movements, marked by the crossing over the red diamonds, suggests a higher probability of the bullish trend continuing.
Sell Signal: Similarly, if the price and the Donchian lines fall below the low threshold, visualized as green diamonds, this signals a significant bearish momentum. The simultaneous decline of the price and the Donchian lines below this threshold, marked by the green diamonds, indicates that not only is the price decreasing, but the bearish trend is strong enough to influence the price extremes calculated by the Donchian lines.
Configuration:
-The "Initial Dynamic Length of MAS Price" parameter controls the smoothness and sensitivity of the indicator. A high value smooths the Simple Moving Average (SMA), making the indicator less responsive to short-term price fluctuations. On the other hand, a low value makes the indicator more sensitive to short-term price fluctuations, generating faster and more volatile signals
-This parameter, "MAS Amplitude Percentage," determines the amplitude as a percentage. Increasing the Initial Dynamic Price will result in a larger amplitude relative to the price, leading to wider ranges for the indicator. Decreasing this value will have the opposite effect, reducing the amplitude relative to the price. Increasing "A_mas_pct" can make signals more extreme and less frequent, while decreasing it will make signals smoother and more frequent.
-This parameter, "Angular Frequency of MAS," affects the frequency of oscillations in the calculation of the "Initial Dynamic Price." A higher value of "w" will make the oscillations faster and more frequent, which means that the indicator will be more responsive to abrupt price changes. Conversely, a lower value will make the oscillations slower and smoother, making the indicator less sensitive to rapid price changes. Modifying ""Angular Frequency of MAS,"" directly impacts the frequency of oscillations in the indicator.
KNN Regression [SS]
Another indicator release, I know.
But note, this isn't intended to be a stand-alone indicator, this is just a functional addition for those who program Machine Learning algorithms in Pinescript! There isn't enough content here to merit creating a library (it's only 1 function), but it's a really useful function for those who like machine learning and k-Nearest Neighbours (KNN) algorithms.
About the indicator:
This indicator creates a function to perform KNN-based regression.
In contrast to traditional linear regression, KNN-based regression has the following advantages over linear regression:
Advantages of KNN Regression vs. Linear Regression:
🎯 Non-linearity: KNN is a non-parametric method, meaning it makes no assumptions about the underlying data distribution. This allows it to capture non-linear relationships between features and the target variable.
🎯Simple Implementation: KNN is conceptually simple and easy to understand. It doesn't require the estimation of parameters, making it straightforward to implement.
🎯Robust to Outliers: KNN is less sensitive to outliers compared to linear regression. Outliers can have a significant impact on linear regression models, but KNN tends to be less affected.
Disadvantages of KNN Regression vs. Linear Regression:
🎯 Resource Intensive for Computation: Because KNN operates on identifying the nearest neighbors in a dataset, each new instance has to be searched for and identified within the dataset, vs. linear regression which can create a coefficient-based model and draw from the coefficient for each new data point.
🎯Curse of Dimensionality: KNN performance can degrade with an increasing number of features, leading to a "curse of dimensionality." This is because, in high-dimensional spaces, the concept of proximity becomes less meaningful.
🎯Sensitive to Noise: KNN can be sensitive to noisy data, as it relies on the local neighborhood for predictions. Noisy or irrelevant features may affect its performance.
Which is better?
I am very biased, coming from a statistics background. I will always love linear regression and will always prefer it over KNN. But depending on what you want to accomplish, KNN makes sense. If you are using highly skewed data or data that you cannot identify linearity in, KNN is probably preferable.
However, if you require precise estimations of ranges and outliers, such as creating co-integration models, I would advise sticking with linear regression. However, out of curiosity, I exported the function into a separate dummy indicator and pulled in data from QQQ to predict SPY close, and the results are actually very admirable:
And plotted with showing the standard error variance:
Pretty impressive, I must say I was a little shocked, it's really giving linear regression a run for its money. In school I was taught LinReg is the gold standard for modeling, nothing else compares. So as with most things in trading, this is challenging some biases of mine ;).
Functionality of the function
I have permitted 3 types of KNN regression. Traditional KNN regression, as I understand it, revolves around clustering (clustering refers to identifying a cluster, normally of 3 identical cases, and averaging out the dependent variable across those cases). Clustering is great, but when you are working with a finite dataset, identifying exact matches for 2 or 3 clusters can be challenging when you are only looking back at 500 candles or 1000 candles, etc.
So to accommodate this, I have added a functionality to clustering called "Tolerance", which allows you to set a tolerance level for your Euclidean distance parameters. I have tested this with a default of 0.5 and it has worked great, with no need to change it even when working with large numbers such as NQ and ES1!.
However, I have added 2 additional regression types that can be done with KNN.
#1 One is a regression by the last IDENTICAL instance, which will find the most recent instance of a similar Independent variable and pull the Dependent variable from that instance. Or
#2 Average from all IDENTICAL instances.
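The published function isn't reproduced here, but a heavily simplified sketch of the clustering-with-tolerance idea described above might look like this (the symbol choice, inputs, and final smoothing are illustrative assumptions):

//@version=5
indicator("KNN regression sketch")

k         = input.int(3,     "Neighbours (cluster size)")
tolerance = input.float(0.5, "Distance tolerance")
lookback  = input.int(500,   "Training window")

x = request.security("NASDAQ:QQQ", timeframe.period, close)  // independent variable (assumed example)
y = close                                                    // dependent variable

// Average the dependent value of past bars whose independent value lies
// within `tolerance` of the current one, stopping after k matches.
float total   = 0.0
int   matches = 0
for i = 1 to lookback
    if matches < k and math.abs(nz(x[i], 1.0e10) - x) <= tolerance
        total   += y[i]
        matches += 1
estimate = matches > 0 ? total / matches : na

plot(ta.sma(estimate, 14), "Smoothed KNN estimate", color = color.orange)

Stopping the loop after the first match would turn this into the "last identical instance" variant, while removing the k cap entirely gives the "average of all identical instances" variant.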
Using the function
The code has the instructions for integrating the function into your own code, the parameters, and such, so I won't exhaust you with the boring details about that here.
But essentially, it exports 3 float variables: the Result, the Correlation, and the simplified R2.
As this is KNN regression, there are no coefficients, slopes, or intercepts and you do not need to test for linearity before applying it.
Also, the output can be a bit choppy, so I tend to like to throw in a bit of smoothing using the ta.sma function at a default of 14.
For example, here is SPY from QQQ smoothed as a 14 SMA:
And it is unsmoothed:
It seems relatively similar but it does make a bit of an aesthetic difference. And if you are doing it over 14, there is no data loss and it is still quite reactive to changes in data.
And that's it! Hopefully you enjoy and find some interesting uses for this function in your own scripts :-).
Safe trades everyone!
IPDA Standard Deviations [DexterLab x TFO x toodegrees]
> Introduction and Acknowledgements
The IPDA Standard Deviations tool encompasses the Time and price relationship as studied by @TraderDext3r .
I am not the creator of this Theory, and I do not hold the answers to all the questions you may have; I suggest you study it from Dexter's tweets, videos, and material.
This tool was born from a collaboration between @TraderDext3r, @tradeforopp and I, with the objective of bringing a comprehensive IPDA Standard Deviations tool to Tradingview.
> Tool Description
This is purely a graphical aid for traders to be able to quickly determine Fractal IPDA Time Windows, and trace the potential Standard Deviations of the moves at their respective high and low extremes.
The disruptive value of this tool is that it allows traders to save Time by automatically adapting the Time Windows based on the current chart's Timeframe, as well as providing customizations to filter and focus on the appropriate Standard Deviations.
> IPDA Standard Deviations by TraderDext3r
The underlying idea is based on the Interbank Price Delivery Algorithm's lookback windows on the daily chart as taught by the Inner Circle Trader:
IPDA looks at the past three months of price action to determine how to deliver price in the future.
Additionally, the ICT concept of projecting specific manipulation moves prior to large displacement upwards/downwards is used to navigate and interpret the priorly mentioned displacement move. We pay attention to specific Standard Deviations based on the current environment and overall narrative.
Dexter, being one of the most prominent Inner Circle Trader students, harnessed the fractal nature of price to derive fractal IPDA Lookback Time Windows for lower Timeframes, and studied the behaviour of price at specific Deviations.
For Example:
The -1 to -2 area can initiate an algorithmic retracement before continuation.
The -2 to -2.5 area can initiate an algorithmic retracement before continuation, or a Smart Money Reversal.
The -4 area should be seen as the ultimate objective, or the level at which the displacement will slow down.
Given that these ideas stem from ICT's concepts themselves, they are to be used hand in hand with all other ICT Concepts (PD Array Matrix, PO3, Institutional Price Levels, ...).
> Fractal IPDA Time Windows
The IPDA Lookbacks Types identified by Dexter are as follows:
Monthly – 1D Chart: one window per Month, highlighting the past three Months.
Weekly – 4H to 8H Chart: one window per Week, highlighting the past three Weeks.
Daily – 15m to 1H Chart: one window per Day, highlighting the past three Days.
Intraday – 1m to 5m Chart: one window per 4 Hours highlighting the past 12 Hours.
Inside these three respective Time Windows, the extreme High and Low will be identified, as well as the prior opposing short term market structure point. These represent the anchors for the Standard Deviation Projections.
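As a rough sketch of the projection math only (the anchors here are simplified to the window's extreme high and low; the tool's actual anchor logic also uses the prior opposing short-term structure point, as described above):

//@version=5
indicator("IPDA deviation projection sketch", overlay = true)

windowBars = input.int(90, "Bars in the current Time Window")  // hypothetical input

// Simplified anchors: the extreme high and low of the window
anchorHigh = ta.highest(high, windowBars)
anchorLow  = ta.lowest(low,  windowBars)
rng        = anchorHigh - anchorLow

// Standard Deviation projections extended below the low (-1, -2, -2.5, -4)
plot(anchorLow - 1.0 * rng, "-1 SD",   color = color.gray)
plot(anchorLow - 2.0 * rng, "-2 SD",   color = color.gray)
plot(anchorLow - 2.5 * rng, "-2.5 SD", color = color.gray)
plot(anchorLow - 4.0 * rng, "-4 SD",   color = color.gray)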
> Tool Settings
The User is able to plot any type of Standard Deviation they want by inputting them in the settings, in their own line of the text box. They will always be plotted from the Time Windows extremes.
As previously mentioned, the User is also able to define their own Timeframe intervals for the respective IPDA Lookback Types. The specific Timeframes on which the different Lookback Types are plotted are edge-inclusive. In case of an overlap, the higher Timeframe Lookback will be prioritized.
Finally the User is able to filter and remove Standard Deviations in two ways:
"Remove Once Invalidated" will automatically delete a Deviation once its outer anchor extreme is traded through.
Manual Toggles will allow the User to remove the Upward or Downward Deviation of each Time Window at their discretion.
Major shoutout to Dexter and TFO for their Time, it was a pleasure to collaborate and create this tool with them.
GLGT!
Machine Learning Momentum Oscillator [ChartPrime]
The Machine Learning Momentum Oscillator brings together the K-Nearest Neighbors (KNN) algorithm and the predictive strength of True Strength Index (TSI) Momentum. This unique oscillator not only uses the insights from TSI Momentum but also taps into the power of machine learning, and is therefore designed to give traders a more comprehensive view of market momentum.
At its core, the Machine Learning Momentum Oscillator blends TSI Momentum with the capabilities of the KNN algorithm. Introducing KNN logic allows for better handling of noise in the data set. The TSI Momentum is known for showing how strong trends are and which direction they're headed, and now, with the added layer of machine learning, we're able to offer a deeper perspective on market trends. This is a fairly classical approach when it comes to visuals and trading.
Green bars show the trader when the asset is in an uptrend. On the flip side, red bars mean things are heading down, signaling a bearish movement driven by selling pressure. These color cues make it easier to catch the sentiment and direction of the market in a glance.
Yellow boxes are also displayed by the oscillator. These boxes highlight potential turning points or peaks. When the market comes close to these points, they can provide a heads-up about the possibility of changes in momentum or even a trend reversal, helping a trader make informed choices quickly. These can be looked at as possible reversal areas simply put.
Settings:
Users can adjust the number of neighbours in the KNN algorithm and choose the periods they prefer for analysis. This way, the tool becomes a part of a trader's strategy, adapting to different market conditions as they see fit. Users can also adjust the smoothing used by the oscillator via the smoothing input.
[blackcat] L1 TradingView Array and Series Conversions
Level 1
Background
It just so happens that I need some functions that can convert between the Series data type and the Array data type.
Function
Series is a unique data type of TradingView. By operating Series data, the algorithm can be simplified, which is very convenient. However, in high-level languages, Array is a basic data type that provides great flexibility and can be used to develop advanced algorithms. This is why TradingView introduces the Array data type. This script simply demonstrates how to convert between these two data types.
s2a function: Convert a TV series into an array.
a2s function: Convert an array into a TV series
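As a hedged sketch of what such conversions can look like in Pine Script (the function bodies below are illustrative, not the author's exact code):

//@version=5
indicator("Series/array conversion sketch")

// s2a: copy the last `len` values of a series into an array
s2a(src, len) =>
    a = array.new_float(0)
    for i = 0 to len - 1
        array.push(a, nz(src[i], src))  // index 0 = current bar, higher indices = older bars
    a

// a2s: read one element of an array back out as a plottable value
a2s(a, idx) =>
    idx >= 0 and idx < array.size(a) ? array.get(a, idx) : na

arr = s2a(close, 10)
plot(a2s(arr, 0), "Most recent element")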
Finally, Courtesy of Electrified for his "Average Lib":
Remarks
Feedbacks are appreciated.
Market Price Order Divergence + Trapped Positions [Pt]
█ Introduction
Specifically designed for trading on NYSE, NASDAQ, Dow Jones, and AMEX related instruments like SPY, QQQ, ES, NQ...etc., this innovative tool provides traders with advanced market insights to help them comprehend the market intricacies and make well-informed decisions. Comprising three primary features: Price Order Divergence (POD) Bubbles, Market Order Bubbles, and Trapped Positions/Zones, this tool assists traders in deciphering the nuances of market order flow and trends.
An important point to note is that TradingView doesn't currently provide direct access to market order data, such as buy and sell order flow. Therefore, this tool cleverly leverages TICK index data to estimate the overall market buy and sell strength.
█ Price Order Divergence (POD)
POD serves to detect disparities between the prices of US indices and estimated market orders during regular trading hours (9:30 to 16:00 EST). Bullish divergence indicates that the estimated market order flow is biased towards buy orders, despite bearish price action. In contrast, bearish divergence indicates that the market order flow is biased towards sell orders while the price exhibits bullish action. By default, PODs are visually represented as green bubbles under the candle for bullish divergence and red ones above the candle for bearish divergence. The bubble's size symbolizes the estimated market order strength.
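As a very rough, hypothetical sketch of the general idea only (this is not the author's algorithm; the TICK symbol, smoothing, and divergence test are all assumptions): TICK data can act as a proxy for market order sentiment, and a POD-style condition can be flagged when price and that sentiment disagree.

//@version=5
indicator("Price/order divergence sketch", overlay = true)

lookback = input.int(14, "Divergence lookback")

// Proxy for market order sentiment: smoothed NYSE TICK (assumed symbol)
tickClose = request.security("USI:TICK", timeframe.period, close)
sentiment = ta.ema(tickClose, lookback)

// Bullish divergence sketch: price makes a lower low while sentiment stays buy-biased
bullishDiv = low < ta.lowest(low, lookback)[1] and sentiment > 0
// Bearish divergence sketch: price makes a higher high while sentiment stays sell-biased
bearishDiv = high > ta.highest(high, lookback)[1] and sentiment < 0

plotshape(bullishDiv, style = shape.circle, location = location.belowbar, color = color.green, size = size.tiny)
plotshape(bearishDiv, style = shape.circle, location = location.abovebar, color = color.red,   size = size.tiny)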
█ Market Order Bubbles (MOB)
During extended or Globex hours, instead of POD, the tool uses Market Order Bubbles (MOB) to estimate market orders using volume data. A sophisticated algorithm is used to distinguish between bullish and bearish volume. A strong bullish volume represents significant buy orders, whereas a strong bearish volume represents substantial sell orders. By default, MOBs during these hours are shown in blue for bullish and yellow for bearish activity. Again, the bubble's size symbolizes the estimated market order strength.
█ Trapped Positions/Zones
Trapped positions materialize when PODs or MOBs emerge in trending markets. For example, a bearish divergence during an uptrend suggests significant selling (including shorting), and if the price continues ascending without offering short positions any profit, these positions become 'trapped shorts' and is shown as 'TS' in the zone. The opposite is true for 'trapped longs' or 'TL'.
A price range zone can be delineated from the trapped position candles. If prices revisit these zones and the prevailing market trend stays bullish, the trapped shorts will probably liquidate near the break-even point to mitigate losses. The same rationale applies to bullish divergence in a downtrend. Therefore, these zones often represent support/resistance zones.
█ Potential Use Cases
► Trend Confirmation: POD or MOB can confirm the strength of an ongoing trend. For example, during a bullish trend, a plethora of green bubbles or blue MOBs can affirm the trend's solidity.
► Spotting Reversals: Large, isolated POD or MOB bubbles could indicate potential market reversals. For instance, a prominent red bubble or yellow MOB during an uptrend might hint at an impending trend reversal.
► Risk Management: The Trapped Positions/Zones feature could assist in risk management. When prices approach these zones, traders can anticipate potential large market orders impacting price movements.
► Profit Optimization: This tool can aid traders in optimizing profits by identifying when trapped positions are likely to liquidate, thus predicting potential sharp price movements.
Remember, as with any tool, this should be used alongside other market analyses and not as a standalone indicator. Happy trading!
================================================================================================================
█ Settings Overview
◊ Market - available options: NYSE, NASDAQ, Dow Jones, AMEX. This will be displayed
◊ Lookback period- # of bars to lookback for detecting price vs market order divergences
▼ Regular Hour - Price Order Divergence Bubbles
◊ Show Price Order Divergence (POD) Bubbles - toggle on/off for POD bubbles
◊ └ Use Market Order Sentiment only - Shows divergences between price movement and market order sentiment (amount of buying vs selling)
◊ └ Use Market Order Trend Bias - On top of market order sentiment, the indicator also looks at overall market short term trends to determine divergences
◊ └ Use Threshold Min. Threshold - For filtering order size, the lower the threshold, the more sensitive
◊ └ Use Volume Strength - Take volume into consideration as well, only shows divergence when there is strength in volume
▼ Extended Hour - Market Order Bubbles
◊ Show Market Order Bubbles - toggle on/off for MOB. Using volume data to estimate significant market order activities. Bubbles indicate possible large liquidation activities
◊ └ Volume Analysis period - lookback period for volume analysis
◊ └ Volume Strength period - lookback period for volume strength
▼ Trapped Position Zones
◊ Show Potential Traps - toggle on/off for un-activated trapped zones. They are shown as lightly shaded areas of potential traps. These areas will be activated once price hits the activation %
◊ Show Trapped positions (Regular Hours) - toggle on/off for POD trapped zones. By default, trapped shorts are shown in green, trapped longs are shown in red.
◊ Show Trapped positions (Extended Hours) - toggle on/off for MOB bubbles. By default, trapped shorts are shown in blue, trapped longs are shown in orange.
◊ └ Activation % - Trapped zones are activated if price goes x% of the potential trapped range in the undesirable direction. Default is 100%
◊ Liquidate display options - options: On first touch, Per touch, Fully liquidated
Trapped zones liquidate display options:
▼ Display
◊ General color settings for bubbles, trapped zones, and label size
◊ Use Emoji for bubbles - fun setting that displays bulls and bears by default. This helps really visualize where the bulls and bears are! 🤣🤣 These emoji can be changed in the style setting.
▼ Trapped Zone Channel
The trapped zone channel represents a continuous channel of the closest activated trapped zone area. This allows for creating alerts for trapped zones, and the plot outputs allow for custom Pinescript integration.
◊ Trapped Zone Channel Buffer % - Adds upper and lower buffer for trapped zone channel
◊ Show Trapped Channel - toggle on/off on trapped zone channels
◊ └ Remove channel changing lines - toggle on/off the transition plot lines when switching to the closest trapped zones
◊ Show Trapped Channel Fill - toggle on/off for the trapped zone channel fill
▼ Extra
◊ Display settings for chosen market and indicator title
▼ Trend Follower
◊ Show Trend Following Bar Color - toggle trend follower algorithm. This is an experimental trend following algorithm that attempts to detect bullish, neutral and bearish trends.
▼ Outputs
◊ Output Bubbles
Outputs for Bubbles for external interface. These can be used as inputs to your own indicator or strategy Pinescript. For more info, take a look at this TradingView blog:
www.tradingview.com
Bubble type can be chosen within the settings:
Both - Default, output will include both Market Price Order Divergence Bubbles (during Regular Hours) and Market Order Bubbles (during Extended Hours)
POD Only (RTH) - Output will include only Market Price Order Divergence Bubbles; otherwise, output = 0 during Extended Hours
MOB Only (ETH) - Output will include only Market Order Bubbles; otherwise, output = 0 during Regular Hours
Market Order Bubbles output values:
3 = Large size Bullish Bubble
2 = Medium size Bullish Bubble
1 = Small size Bullish Bubble
0 = No Bubble
-1 = Small size Bearish Bubble
-2 = Medium size Bearish Bubble
-3 = Large size Bearish Bubble
MTF Fusion - S/R Trendlines [TradingIndicators]
MTF Fusion S/R Trendlines intelligently adapt to whatever timeframe you're trading - dynamically calculating support and resistance trendline levels combined from four appropriate higher timeframes to give you a much broader view of the market and an edge in your trading decisions.
These trendlines are not programmed to repaint - so you can use them in real-time just as they appeared historically.
What is MTF Fusion?
Multi-Timeframe (MTF) Fusion is the process of combining calculations from multiple timeframes higher than the chart's into one 'fused' value or indicator. It is based on the idea that integrating data from higher timeframes can help us to better identify short-term trading opportunities within the context of long-term market trends.
How does it work?
Let's use the context of this indicator, which calculates S/R Trendlines, as an example to explain how MTF Fusion works and how you can perform it yourself.
Step 1: Selecting Higher Timeframes
The first step is to determine the appropriate higher timeframes to use for the fusion calculation. These timeframes should typically be chosen based on their ability to provide meaningful price levels and action which actively affect the price action of the smaller timeframe you're focused on. For example, if you are trading the 5 minute chart, you might select the 15 minute, 30 minute, and hourly timeframe as the higher timeframes you want to fuse in order to give you a more holistic view of the trends and action affecting you on the 5 minute. In this indicator, four higher timeframes are automatically selected depending on the timeframe of the chart it is applied to.
Step 2: Gathering Data and Calculations
Once the higher timeframes are identified, the next step is to calculate the data from these higher timeframes that will be used to calculate your fused values. In this indicator, for example, the values of support and resistance trendlines are calculated for all four higher timeframes.
Step 3: Fusing the Values From Higher Timeframes
The next step is to actually combine the values from these higher timeframes to obtain your 'fused' indicator values. The simplest approach to this is to simply average them. If you have calculated the value of a support trendline from three higher timeframes, you can, for example, calculate your 'multi-timeframe fused trendline' as (HigherTF_Support_Trendline_1 + HigherTF_Support_Trendline_2 + HigherTF_Support_Trendline_3) / 3.0.
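For example, a bare-bones Pine Script sketch of Steps 2 and 3 might look like this (the 15/30/60-minute set, the lowest-low support proxy, and the equal-weight average are illustrative assumptions; the indicator's actual timeframe selection and weighting are adaptive, as explained below):

//@version=5
indicator("MTF fusion sketch", overlay = true)

length = input.int(50, "Trendline lookback")

// Step 2: a simple support proxy (lowest low) calculated on three higher timeframes
sup15 = request.security(syminfo.tickerid, "15", ta.lowest(low, length))
sup30 = request.security(syminfo.tickerid, "30", ta.lowest(low, length))
sup60 = request.security(syminfo.tickerid, "60", ta.lowest(low, length))

// Step 3: fuse the higher-timeframe values by averaging them
fusedSupport = (sup15 + sup30 + sup60) / 3.0

plot(fusedSupport, "Fused support", color = color.teal)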
Step 4: Visualization and Interpretation
Once the calculations are complete, the resulting fused indicator values are plotted on the chart. These values reflect the fusion of data from the multiple higher timeframes, giving a broader perspective on the market's behavior and potentially valuable insights without the need to manually consider values from each higher timeframe yourself.
What makes this script unique? Why is it closed source?
While the process described above is fairly unique and sounds simple, the truly important key lies in determining which higher timeframes to fuse together, and how to weight their values when calculating the fused end result in such a way that best leverages their relationship for useful TA.
This MTF Fusion indicator employs a smart, adaptive algorithm which automatically selects appropriate higher timeframes to use in fusion calculations depending on the timeframe of the chart it is applied to. It also uses a dynamic algorithm to adjust and weight the lookbacks used for trendline calculations depending on each higher timeframe's relationship to the chart timeframe. These algorithms are based on extensive testing and are the reason behind this script's closed source status.
Included Features
Fusion Support and Resistance Trendlines
Dynamic Multi-Timeframe Trendlines
Breakaway Zone fills to highlight breakouts and breakdowns from the Fusion trendlines
Customizable lookback approach
Pre-built color stylings
Options
Fusion View: Show/hide the Fusion trendlines calculated from multiple higher timeframes
MTF View: Show/hide the trendlines from multiple higher timeframes used to calculate the Fusion trendlines
Breakaway Zones: Show/hide the fill for zones where price breaks away from the Fusion trendlines
Lookback: Select how you want your trendlines to be calculated (longer = long-term trendlines, shorter = short-term trendlines)
Pre-Built Color Styles: Use a pre-built color styling (uncheck to use your own colors)
Manual Color Styles: When pre-built color styles are disabled, use these color inputs to define your own
Discrete Fourier Transform Overlay [wbburgin]
The discrete Fourier transform (DFT) overlay uses a discrete Fourier transform algorithm to identify trend direction. This is a simpler interpretation that only uses the magnitude of the first frequency component obtained from the DFT algorithm, but can be useful for visualization purposes. I haven't seen many Fourier scripts on TradingView that actually have the magnitude plotted on the chart (some have lines, for instance, but that makes it difficult to look into the past or to see previous lines).
About the Discrete Fourier Transform
The DFT is a mathematical transformation that decomposes a time-domain signal into its constituent frequency components. By applying the DFT to OHLC data, we can interpret the periodicities and trends present in the market. I've designed the overlay so that you can choose your source for the Fourier transform, as well as the length.
Settings and Configuration
The "Fourier Period" is the transform length of the DFT algorithm. This input indicates the number of data points considered for the DFT calculation. For example, if this input is set to 20, the DFT will be performed on the most recent 20 data points of the input series. The transform length affects the resolution and accuracy of the frequency analysis. A shorter transform length may provide a broader frequency range but with less detail, while a longer transform length can provide finer frequency resolution but may be computationally more intensive (I recommend using under 100 - anything above that might take too much time to load on the platform).
The "Fourier X Series" is the source you want the Fourier transform to be applied to. I have it set in default to the close.
"Kernel Smoothing" is the bar-start of the rational quadratic kernel used to smooth the frequency component. Think of it just like a normal moving average if you are unfamiliar with the concept, it functions similarly to the "length" value of a moving average.
Aggregate Medians [wbburgin]
This indicator recursively finds the average of all high/low medians under your chosen length. This can be very, very helpful for analyzing trends where a moving average or a normal median would produce a bunch of false signals.
Settings:
The "Length" setting is the maximum median that you want the algorithm to add into the sum. The "Start at Period" setting is the the minimum median that you want the algorithm to take into account. Starting at a higher period means that the faster, more sensitive medians of lower lengths are not included, and will smooth out your curve.
I haven't seen many recursive algorithms on TradingView so feel free to use this script as inspiration for any of your ideas. In theory, you can essentially replace the median function with any other function - a moving average, a supertrend, or anything else.
The start must be lower than the length, because this is a sum from the start to the length of all medians in between.
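A minimal sketch of that summation (the per-length median is built explicitly here; the original script's implementation may differ in details):

//@version=5
indicator("Aggregate medians sketch", overlay = true)

startLen = input.int(5,  "Start at Period")
maxLen   = input.int(50, "Length")

// Median of the last `len` values of `src`, built explicitly with an array
winMedian(src, len) =>
    vals = array.new_float(0)
    for i = 0 to len - 1
        array.push(vals, nz(src[i], src))
    array.median(vals)

// Sum the medians of every length from startLen to maxLen, then average them
float hiSum = 0.0
float loSum = 0.0
int   count = 0
for len = startLen to maxLen
    hiSum += winMedian(high, len)
    loSum += winMedian(low,  len)
    count += 1

plot(hiSum / count, "Aggregate high median", color = color.green)
plot(loSum / count, "Aggregate low median",  color = color.red)

Replacing winMedian with any other windowed function (a moving average, a supertrend, and so on) gives the same aggregation idea mentioned above.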
Chan Theory - CHANLUN | CZSC
Chan Theory (CHANLUN) is a technical analysis theory created by the Chinese analyst CZSC, primarily applied in the analysis and decision-making of financial markets such as stocks, futures, forex, and crypto.
It is a technical analysis method based on price and time, including candlestick patterns, fractal theory, box theory, trend theory, divergence theory, multiple time frame analysis, and more.
"Chan" means zen, indicating that the fluctuations in the market are rooted in human nature, such as greed, anger, ignorance, slowness, and suspicion.
"Chan" is also the pinyin of the Chinese character '缠', which means entanglement or entwining. as the fluctuations in the stock market were intertwined like a spiral.
Concepts
Fractal - a fractal is formed by three candlesticks, with the middle one being the highest for a top fractal and the lowest for a bottom fractal. In Chan Theory, the first step is to traverse all candlesticks and find all valid fractals.
Stroke - a stroke is usually composed of multiple fractals, with a top fractal and a bottom fractal at its two ends; the connection between them forms a stroke with clear high and low points. This is the smallest unit of composition in Chan Theory, similar to the zigzag algorithm.
Segment - a segment is generated from strokes based on the feature sequence algorithm, and a segment contains at least three strokes. A segment represents a higher level, indicating the trend of the market at that higher level, similar to stepping up from the 5M period to the 30M period.
Box - a box is the overlapping area of multiple segments, and a box contains at least three segments. A box represents a densely traded area and a temporary consensus price range; the bull-bear battle has not produced a clear outcome, meaning the market is in a state of uncertainty and the direction of the trend is unclear.
Trend - in Chan Theory, two or more boxes in the same direction form a trend. If the box positions are gradually rising, it is defined as an uptrend; conversely, it is a downtrend.
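As an illustration of the first step only (finding the fractals defined above), a minimal Pine Script sketch follows; the script's own fractal handling, including merging overlapping candlesticks, is more involved.

//@version=5
indicator("Chan fractal sketch", overlay = true)

// Top fractal: the middle candle of three has the highest high
topFractal    = high[1] > high[2] and high[1] > high
// Bottom fractal: the middle candle of three has the lowest low
bottomFractal = low[1] < low[2] and low[1] < low

plotshape(topFractal,    style = shape.triangledown, location = location.abovebar, offset = -1, color = color.red,   size = size.tiny)
plotshape(bottomFractal, style = shape.triangleup,   location = location.belowbar, offset = -1, color = color.green, size = size.tiny)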
Differences with ZigZag
Both the Chan Theory Stroke and the ZigZag are formed by connecting the high and low points to create a line. But in Chan Theory, there are strict additional requirements:
There must be at least five candlesticks between the high and low points; otherwise it does not form a Stroke.
The high and low fractals cannot share the same candlestick; otherwise it does not form a Stroke.
There must be at least three candlesticks between the high and low fractals, and these three candlesticks must move in the same direction.
There may be complex situations where there are multiple top or bottom patterns in a single Stroke, requiring special handling to determine the connection rules for the lines.
Chan Theory is a complex theory that includes not only the Stroke, but also other concepts such as the Box, Recursion, and Divergence.
Recursion
The processing flow of Chan Theory is similar to a ternary algorithm: it organizes chaotic candlesticks into an orderly system (Fractal -> Stroke -> Segment -> Box -> Trend), with levels gradually increasing from small to large. We can let the levels develop continuously to obtain the appropriate level for analysis and trading; in Chan Theory, this is called "recursion". This method allows us to observe the structure of smaller levels to make trading decisions at the current level, and it allows us to combine multiple levels to determine specific trading points.
Divergence
Chan Theory uses MACD to infer the strength of the trend as momentum analysis. It calculates the MACD area of the candlesticks to quantify the strength of a trend and compares the areas of the two legs on either side of the same-level box to determine whether the trend is exhausted; this is called "divergence". It is one of the important parts of determining trading points.
Market Philosophy and Zen
Grounded in the stock market: "Chan" (entanglement) is the zone where prices overlap, the battleground where buyers and sellers fight a positional war; "Zen" is the way of resolving it. Taking that battleground as the center, compare the strength of the legs before and after it: if the new leg is stronger, hold; if weaker, exit.
Grounded in human reality: "Chan" is the entanglement of human nature - greed, anger, ignorance, arrogance, and doubt; "Zen" is awakening and transcendence. Use Zen to break the entanglement: be like water, like an empty vessel drifting with the waves.
A Brief Technical Outline
Compare strength on either side of a pivot (box), like a tug-of-war: if the force in your direction is stronger, keep the original position; if weaker, trade the opposite way.
Decompose the entire trend at the same level and watch for new trends forming, comparing each new leg with the one before it around the pivot: if it is stronger, hold; if weaker, exit.
Perform multiple same-level decompositions, like sailing a boat or driving a car, shifting gears to adapt to different conditions.
Quantified Components of the Technical Analysis
Morphology - strokes, segments, pivots (boxes), trend types
Dynamics - divergence, pivots, the energy structure of a trend
The YiChan script implements a technical analysis indicator system built around Chan Theory.
Features
Based on Chan Theory analysis: real-time drawing of strokes, segments, and trends; automatic pivot (box) marking; multi-level recursive candlestick trends; real-time labeling of the three Chan Theory buy/sell point types.
Supports configuring multiple stroke, segment, and trend rules to match a trader's habits and style.
Supports the TradingView alert mechanism, pushing buy/sell point notifications for each level to email or webhook in real time.
Chart Legend
Strokes/segments/trends - the blue line is the stroke built from current-level candlesticks; the purple lines are the segments generated from the stroke-level feature sequence and the trends generated from current-level segments.
Pivot levels - lines, pivots, and buy/sell point labels of the same level share one color: stroke-level pivots are light blue, segment-level pivots are orange.
MACD area - the number at the end of each stroke, segment, or trend is its MACD area: blue for stroke MACD area, orange for segment MACD area, purple for trend MACD area.
WaveTrend 3D
█ OVERVIEW
WaveTrend 3D (WT3D) is a novel implementation of the famous WaveTrend (WT) indicator and has been completely redesigned from the ground up to address some of the inherent shortcomings associated with the traditional WT algorithm.
█ BACKGROUND
The WaveTrend (WT) indicator has become a widely popular tool for traders in recent years. WT was first ported to PineScript in 2014 by the user @LazyBear, and since then, it has ascended to become one of the Top 5 most popular scripts on TradingView.
The WT algorithm appears to have origins in a lesser-known proprietary algorithm called Trading Channel Index (TCI), created by AIQ Systems in 1986 as an integral part of their commercial software suite, TradingExpert Pro. The software’s reference manual states that “TCI identifies changes in price direction” and is “an adaptation of Donald R. Lambert’s Commodity Channel Index (CCI)”, which was introduced to the world six years earlier in 1980. Interestingly, a vestige of this early beginning can still be seen in the source code of LazyBear’s script, where the final EMA calculation is stored in an intermediate variable called “tci” in the code.
█ IMPLEMENTATION DETAILS
WaveTrend 3D is an alternative implementation of WaveTrend that directly addresses some of the known shortcomings of the indicator, including its unbounded extremes, susceptibility to whipsaw, and lack of insight into other timeframes.
In the canonical WT approach, an exponential moving average (EMA) for a given lookback window is used to assess the variability between price and two other EMAs relative to a second lookback window. Since the difference between the average price and its associated EMA is essentially unbounded, an arbitrary scaling factor of 0.015 is typically applied as a crude form of rescaling but still fails to capture 20-30% of values between the range of -100 to 100. Additionally, the trigger signal for the final EMA (i.e., TCI) crossover-based oscillator is a four-bar simple moving average (SMA), which further contributes to the net lag accumulated by the consecutive EMA calculations in the previous steps.
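For reference, the canonical WT calculation described above (essentially the LazyBear port) looks roughly like this:

//@version=5
indicator("Canonical WaveTrend (reference)")

n1 = input.int(10, "Channel Length")
n2 = input.int(21, "Average Length")

ap  = hlc3
esa = ta.ema(ap, n1)                  // EMA of the average price
d   = ta.ema(math.abs(ap - esa), n1)  // EMA of the absolute deviation
ci  = (ap - esa) / (0.015 * d)        // unbounded differential, rescaled by the 0.015 factor
tci = ta.ema(ci, n2)                  // the "TCI" line

wt1 = tci
wt2 = ta.sma(wt1, 4)                  // four-bar SMA trigger line

plot(wt1, "WT1", color = color.aqua)
plot(wt2, "WT2", color = color.orange)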
The core idea behind WT3D is to replace the EMA-based crossover system with modern Digital Signal Processing techniques. By assuming that price action adheres approximately to a Gaussian distribution, it is possible to sidestep the scaling nightmare associated with unbounded price differentials of the original WaveTrend method by focusing instead on the alteration of the underlying Probability Distribution Function (PDF) of the input series. Furthermore, using a signal processing filter such as a Butterworth Filter, we can eliminate the need for consecutive exponential moving averages along with the associated lag they bring.
Ideally, it is convenient to have the resulting probability distribution oscillate between the values of -1 and 1, with the zero line serving as a median. With this objective in mind, it is possible to borrow a common technique from the field of Machine Learning that uses a sigmoid-like activation function to transform our data set of interest. One such function is the hyperbolic tangent function (tanh), which is often used as an activation function in the hidden layers of neural networks due to its unique property of ensuring the values stay between -1 and 1. By taking the first-order derivative of our input series and normalizing it using the quadratic mean, the tanh function performs a high-quality redistribution of the input signal into the desired range of -1 to 1. Finally, using a dual-pole filter such as the Butterworth Filter popularized by John Ehlers, excessive market noise can be filtered out, leaving behind a crisp moving average with minimal lag.
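A hedged sketch of that normalization chain, with Ehlers' two-pole Super Smoother standing in for the Butterworth-style filter (WT3D's actual implementation is not reproduced here; lengths and names are assumptions):

//@version=5
indicator("WT3D-style normalization sketch")

src       = input.source(close, "Source")
rmsLen    = input.int(100, "RMS window")
filterLen = input.int(12,  "Two-pole filter length")

// First-order derivative of the source, normalized by its quadratic mean (RMS)
diff = ta.change(src)
rms  = math.sqrt(ta.sma(diff * diff, rmsLen))
z    = diff / rms

// tanh squashes the normalized series into the range -1 to 1
tanh(x) =>
    e = math.exp(2.0 * x)
    (e - 1.0) / (e + 1.0)

bounded = tanh(z)

// Two-pole smoothing filter (Ehlers-style Super Smoother) to strip residual noise
a1 = math.exp(-1.414 * math.pi / filterLen)
b1 = 2.0 * a1 * math.cos(1.414 * math.pi / filterLen)
c2 = b1
c3 = -a1 * a1
c1 = 1.0 - c2 - c3
var float filt = 0.0
filt := c1 * (bounded + nz(bounded[1])) / 2.0 + c2 * nz(filt[1]) + c3 * nz(filt[2])

plot(filt, "Normalized oscillator", color = color.teal)
hline(0)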
Furthermore, WT3D expands upon the original functionality of WT by providing:
First-class support for multi-timeframe (MTF) analysis
Kernel-based regression for trend reversal confirmation
Various options for signal smoothing and transformation
A unique mode for visualizing an input series as a symmetrical, three-dimensional waveform useful for pattern identification and cycle-related analysis
█ SETTINGS
This is a summary of the settings used in the script listed in roughly the order in which they appear. By default, all default colors are from Google's TensorFlow framework and are considered to be colorblind safe.
Source: The input series. Usually, it is the close or average price, but it can be any series.
Use Mirror: Whether to display a mirror image of the source series; for visualizing the series as a 3D waveform similar to a soundwave.
Use EMA: Whether to use an exponential moving average of the input series.
EMA Length: The length of the exponential moving average.
Use COG: Whether to use the center of gravity of the input series.
COG Length: The length of the center of gravity.
Speed to Emphasize: The target speed to emphasize.
Width: The width of the emphasized line.
Display Kernel Moving Average: Whether to display the kernel moving average of the signal. Like PCA, an unsupervised Machine Learning technique whereby neighboring vectors are projected onto the Principal Component.
Display Kernel Signal: Whether to display the kernel estimator for the emphasized line. Like the Kernel MA, it can show underlying shifts in bias within a more significant trend by the colors reflected on the ribbon itself.
Show Oscillator Lines: Whether to show the oscillator lines.
Offset: The offset of the emphasized oscillator plots.
Fast Length: The length scale factor for the fast oscillator.
Fast Smoothing: The smoothing scale factor for the fast oscillator.
Normal Length: The length scale factor for the normal oscillator.
Normal Smoothing: The smoothing scale factor for the normal frequency.
Slow Length: The length scale factor for the slow oscillator.
Slow Smoothing: The smoothing scale factor for the slow frequency.
Divergence Threshold: The number of bars for the divergence to be considered significant.
Trigger Wave Percent Size: How big the current wave should be relative to the previous wave.
Background Area Transparency Factor: Transparency factor for the background area.
Foreground Area Transparency Factor: Transparency factor for the foreground area.
Background Line Transparency Factor: Transparency factor for the background line.
Foreground Line Transparency Factor: Transparency factor for the foreground line.
Custom Transparency: Transparency of the custom colors.
Total Gradient Steps: The number of steps used for the gradient calculation; the maximum supported is 256.
Fast Bullish Color: The color of the fast bullish line.
Normal Bullish Color: The color of the normal bullish line.
Slow Bullish Color: The color of the slow bullish line.
Fast Bearish Color: The color of the fast bearish line.
Normal Bearish Color: The color of the normal bearish line.
Slow Bearish Color: The color of the slow bearish line.
Bullish Divergence Signals: The color of the bullish divergence signals.
Bearish Divergence Signals: The color of the bearish divergence signals.
█ ACKNOWLEDGEMENTS
@LazyBear - For authoring the original WaveTrend port on TradingView
@PineCoders - For the beautiful color gradient framework used in this indicator
@veryfid - For the inspiration of using mirrored signals for cycle analysis and using multiple lookback windows as proxies for other timeframes
Improved Chaikin Money FlowChaikin Money Flow is a well-known indicator for gauging buying/selling pressure. Marc Chaikin intended it to be used on the daily timeframe to capture the behavior of price action at or near the daily close, when larger-scale actors influence the market. The calculation is straightforward, as described in the built-in TradingView "CMF" indicator:
1. Period Money Flow Multiplier = ((Close - Low) - (High - Close)) /(High - Low)
2. Period Money Flow Volume = Period Money Flow Multiplier x Volume for the Period
3. Chaikin Money Flow = 21 Period Sum of Money Flow Volume / 21 Period Sum of Volume
There is, however, a problem with this algorithm: it does not account for daily gaps in price action. This can leave the indicator out of sync with price action and/or under-emphasize the magnitude of the indicator's change relative to the change in price. This is a significant problem for anyone trying to read divergences against an underlying.
Note: I have never seen a published attempt to improve this indicator, which is why I decided to find a way to do it myself.
In order to mitigate this issue, I have taken the basic script provided by TradingView and made a key modification. If the open of a candle is outside the range of the previous candle, then the close of the previous candle is used as the "high" for the current candle (in the case of a gap down) or the "low" for the current candle (in the case of a gap up). However, if the close of the current candle exceeds the previous close, highs and lows for the current candle are calculated as normal. I believe this accounts for gaps in price action without significantly altering the original intent of the indicator.
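For illustration, here is a small pandas sketch of that gap adjustment applied to the CMF calculation above (column names and the exact continuation condition reflect one reading of the rule, not the published script):

```python
import pandas as pd

def improved_cmf(df: pd.DataFrame, length: int = 21) -> pd.Series:
    # df columns assumed: open, high, low, close, volume (daily bars)
    high, low = df["high"].copy(), df["low"].copy()
    prev_close = df["close"].shift(1)
    gap_up = df["open"] > df["high"].shift(1)
    gap_down = df["open"] < df["low"].shift(1)

    # On a gap, substitute the prior close for the current bar's low/high unless the
    # current close pushes beyond the prior close (treated here as trend continuation).
    low = low.where(~(gap_up & (df["close"] <= prev_close)), prev_close)
    high = high.where(~(gap_down & (df["close"] >= prev_close)), prev_close)

    rng = (high - low).replace(0, float("nan"))
    mfm = ((df["close"] - low) - (high - df["close"])) / rng      # money flow multiplier
    mfv = mfm * df["volume"]                                      # money flow volume
    cmf = mfv.rolling(length).sum() / df["volume"].rolling(length).sum()
    return cmf * 100                                              # +/-100 scaling, per tweak 2
```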
I have made four other minor tweaks:
1. Default style is color coded area above and below the Zero Line
2. Range scaled to +/-100 instead of +/-1 (displays better on graph)
3. Set timeframe to Daily (as that is the timeframe for which this indicator was intended by Chaikin)
4. Length defaults to 21 (which is what Chaikin uses)
Extreme Trend Reversal Points [HeWhoMustNotBeNamed]Using moving average crossovers to identify a change in trend is very common. However, this method can give many false signals during ranging markets. In this algorithm, we try to identify extreme trends by looking at fully aligned multi-level moving averages, and we only consider a moving average crossover when the market is in such an extreme trend - either bullish or bearish. These points can mark the start of a long-term reversal, or they can simply produce a small pullback before the trend continues. In this discussion, we will also look at how to handle the different scenarios.
🎲 Components
🎯 Recursive Multi Level Moving Averages
Multi level moving average here refers to applying moving average on top of base moving average on multiple levels. For example,
Level 1 SMA = SMA(source, length)
Level 2 SMA = SMA(Level 1 SMA, length)
Level 3 SMA = SMA(Level 2 SMA, length)
..
..
..
Level n SMA = SMA(Level (n-1) SMA, length)
In this script, the user can select how many levels of moving averages should be calculated. This is achieved through a "recursive moving average" algorithm. The requirement for building such an algorithm was initially raised by @loxx
While I was able to develop it in minimal code with the help of some existing libraries built on arrays and matrices, I also thought: why not extend this to find something interesting?
Note that since we are using a variable number of levels, we cannot plot every moving average level (plotting cannot be done inside a loop). Hence, we use lines to display the latest moving average levels in front of the last candle. The lines are color coded so that lower-numbered levels are greener and higher-numbered levels are redder.
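A minimal Python sketch of this recursion (illustrative only; the actual script builds the levels with Pine arrays and matrices):

```python
import numpy as np

def sma(x: np.ndarray, length: int) -> np.ndarray:
    out = np.full(len(x), np.nan)
    for i in range(length - 1, len(x)):
        out[i] = np.mean(x[i - length + 1:i + 1])
    return out

def recursive_sma_levels(source, length: int, levels: int) -> list:
    # Level 1 = SMA(source, length); Level n = SMA(Level n-1, length), as listed above.
    current = np.asarray(source, dtype=float)
    out = []
    for _ in range(levels):
        current = sma(current, length)
        out.append(current)
    return out
```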
🎯 Finding the trend and range
The strength of the moving average alignment is calculated from the position of each level with respect to the other levels.
For example, in a complete uptrend, we can find
source > L(1)MA > L(2)MA > L(3)MA ...... > L(n-1)MA > L(n)MA
Similarly in a complete downtrend, we can find
source < L(1)MA < L(2)MA < L(3)MA ...... < L(n-1)MA < L(n)MA
Hence, the strength of the trend is calculated from the relative positions of the levels. As a result, the strength value can range from 0 to Level*(Level-1)/2
0 represents the complete downtrend
Level*(Level-1)/2 represents the complete uptrend.
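One way such a pairwise score could be computed (a sketch; the script's exact counting may differ):

```python
# Score pairwise alignment of the levels: with n levels the score ranges from 0
# (complete downtrend, every faster level below every slower one) to n*(n-1)/2
# (complete uptrend).
def alignment_strength(level_values: list) -> int:
    n = len(level_values)
    score = 0
    for i in range(n):
        for j in range(i + 1, n):
            if level_values[i] > level_values[j]:   # faster level above slower level
                score += 1
    return score

# e.g. alignment_strength([101, 100, 99, 98]) == 6 for 4 fully aligned bullish levels
```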
Range and Extreme Range are calculated as percentile brackets around the median. The brackets are defined by the input parameters Range Percentile and Extreme Range Percentile, using Percentile History as the reference length.
Moving average plot is color coded to display the trend strength.
Green - Extreme Bullish
Lime - Bullish
Silver - range
Orange - Bearish
Red - Extreme Bearish
🎯 Finding the trend reversal
Possible trend reversals occur when price crosses the moving average while in a complete trend with all the moving averages fully aligned. Triangle marks are placed at such locations to help spot probable trend reversal points. However, the trend can still override these levels; an example of this can be seen here:
In order to overcome this problem, we can employ a few techniques.
1. After the signal, wait for trend reversal (moving average plot color to turn silver) before placing your order.
2. Place stop orders at immediate pivot levels or support/resistance points instead of opening a market order. This way, we can also place an order in the direction of the trend. Whichever side price breaks out of will be the direction to trade.
3. Look for other confirmations such as extremely bullish and bearish candles before placing the orders.
🎯 An example of using stop orders
Let us take this scenario where there is a signal on possible reversal from complete uptrend.
Create a box joining high and low pivots at a reasonable distance. You can also choose to add 1 ATR of additional distance from the pivots.
Use the top of the box as stop-entry for long and bottom as stop-entry for short. The other ends of the box can become stop-losses for each side.
After a few bars, we can see that a few more signals are plotted, but the price is still within the box. Some candles touched the top of the box, but the candlestick patterns did not represent bullishness on those instances. If you have placed stop orders, these orders would have already been filled. In that case, just wait for the position to hit either the stop or the target.
For bullish side, targets can be placed at certain risk reward levels. In this case, we just use 1:1 for bullish (trend side) and 1:1.5 for bearish side (reversal side)
In this case, price hit the target without any issue:
Wait for next reversal signal to appear before placing another order :)
American Approximation Bjerksund & Stensland 2002 [Loxx]American Approximation Bjerksund & Stensland 2002 is an American Options pricing model. This indicator also includes numerical greeks. You can compare the American Approximation's output to the Black-Scholes-Merton value shown in the options panel.
The Bjerksund & Stensland (2002) Approximation
The Bjerksund and Stensland (2002) approximation divides the time to maturity into two parts, each with a separate flat exercise boundary. It is thus a straightforward generalization of the Bjerksund-Stensland 1993 algorithm. The method is fast and efficient and should be more accurate than the Barone-Adesi and Whaley (1987) and the Bjerksund and Stensland (1993b) approximations. The algorithm requires an accurate cumulative bivariate normal approximation. Several approximations that are described in the literature are not sufficiently accurate, but the Genz algorithm works.
C = alpha2*S^B - alpha2*phi(S, t1, B, I2, I2)
+ phi(S, t1, 1, I2, I2) - phi(S, t1, 1, I1, I2)
- X*phi(S, t1, 0, I2, I2) + X*phi(S, t1, 0, I1, I2)
+ alpha1*phi(S, t1, B, I1, I2) - alpha1*psi(S, T, B, I1, I2, I1, t1)
+ psi(S, T, 1, I1, I2, I1, t1) - psi(S, T, 1, X, I2, I1, t1)
- X*psi(S, T, 0, I1, I2, I1, t1) + X*psi(S, T, 0, X, I2, I1, t1)
where
alpha1 = (I1 - X)*I1^-B
alpha2 = (I2 - X)*I2^-B
B = (1/2 - b/v^2) + ((b/v^2 - 1/2)^2 + 2*(r/v^2))^0.5
The function phi(S, T, gamma, H, I) is given by
phi(S, T, gamma, H, I) = e^(lambda*T) * S^gamma * (N(-d) - (I/S)^k * N(-d2))
d = (log(S/H) + (b + (gamma - 1/2) * v^2) * T) / (v * T^0.5)
d2 = (log(I^2/(S*H)) + (b + (gamma - 1/2) * v^2) * T) / (v * T^0.5)
lambda = -r + gamma * b + 1/2 * gamma * (gamma - 1) * v^2
k = 2*b/v^2 + (2 * gamma - 1)
and the trigger price I is defined as
I1 = B0 + (B(+infi) - B0) * (1 - e^h1)
I2 = B0 + (B(+infi) - B0) * (1 - e^h2)
h1 = -(b*t1 + 2*v*t1^0.5) * (X^2 / ((B(+infi) - B0) * B0))
h2 = -(b*T + 2*v*T^0.5) * (X^2 / ((B(+infi) - B0) * B0))
t1 = 1/2 * (5^0.5 - 1) * T
B(+infi) = (B / (B - 1)) * X
B0 = max(X, (r / (r - b)) * X)
Moreover, the function psi(S, T, gamma, H, I2, I1, t1) is given by
psi(S, T, gamma, H, I2, I1, t1, r, b, v) = e^(lambda * T) * S^gamma * (M(-e1, -f1, rho) - (I2/S)^k * M(-e2, -f2, rho)
- (I1/S)^k * M(-e3, -f3, -rho) + (I1/I2)^k * M(-e4, -f4, -rho))
where (see screenshot for e and f values)
b=r options on non-dividend paying stock
b=r-q options on stock or index paying a dividend yield of q
b=0 options on futures
b=r-rf currency options (where rf is the rate in the second currency)
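As a small illustration, the phi building block above translates almost line for line into Python using scipy's cumulative normal distribution. Names follow the formulas in the text (b = cost of carry, v = volatility); the e^(lambda*T) exponent mirrors the psi definition and is an assumption about the intended notation, and this is a sketch rather than the script's internal code:

```python
from math import exp, log, sqrt
from scipy.stats import norm

def phi(S, T, gamma, H, I, r, b, v):
    lam = -r + gamma * b + 0.5 * gamma * (gamma - 1) * v ** 2
    d = (log(S / H) + (b + (gamma - 0.5) * v ** 2) * T) / (v * sqrt(T))
    d2 = (log(I ** 2 / (S * H)) + (b + (gamma - 0.5) * v ** 2) * T) / (v * sqrt(T))
    k = 2 * b / v ** 2 + (2 * gamma - 1)
    return exp(lam * T) * S ** gamma * (norm.cdf(-d) - (I / S) ** k * norm.cdf(-d2))
```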
Inputs
S = Stock price.
K = Strike price of option.
T = Time to expiration in years.
r = Risk-free rate
c = Cost of Carry
V = Variance of the underlying asset price
cnd1(x) = Cumulative Normal Distribution
cbnd3(x) = Cumulative Bivariate Normal Distribution
nd(x) = Standard Normal Density Function
convertingToCCRate(r, cmp) = Rate compounder
Numerical Greeks or Greeks by Finite Difference
Analytical Greeks are the standard approach to estimating Delta, Gamma, etc. That is what we typically use when we can derive them from closed form solutions. Normally, these are well-defined and available in textbooks. Previously, we relied on closed form solutions for the call or put formulae differentiated with respect to the Black Scholes parameters. When Greeks formulae are difficult to develop or tease out, we can alternatively employ numerical Greeks - sometimes referred to as finite difference approximations. A key advantage of numerical Greeks is that they can be estimated without deriving the mathematical Greeks. This can be important when we examine American options, where there may not technically exist an exact closed form solution that is straightforward to work with. (via VinegarHill FinanceLabs)
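A generic finite-difference sketch of the idea, written against any pricing function price(S, K, T, r, b, v) (the bump sizes and the set of Greeks shown are illustrative, not the indicator's internal choices):

```python
def numerical_greeks(price, S, K, T, r, b, v, dS=0.01, dv=0.01, dT=1 / 365):
    up = price(S + dS, K, T, r, b, v)
    down = price(S - dS, K, T, r, b, v)
    mid = price(S, K, T, r, b, v)
    delta = (up - down) / (2 * dS)                  # central difference, first derivative
    gamma = (up - 2 * mid + down) / (dS ** 2)       # central difference, second derivative
    vega = (price(S, K, T, r, b, v + dv) - price(S, K, T, r, b, v - dv)) / (2 * dv)
    theta = price(S, K, T - dT, r, b, v) - mid      # one-day time decay
    return {"delta": delta, "gamma": gamma, "vega": vega, "theta": theta}
```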
Things to know
Only works on the daily timeframe and for the current source price.
You can adjust the text size to fit the screen
Nadaraya-Watson CombineThis is a combination of the Lux Algo Nadaraya-Watson Estimator and Envelope. Please note the repainting issue.
In addition, I've added a plot of the actual values of the Nadaraya-Watson windows at the current bar state as they are computed (lines 92-95). It only plots values for the current data at each time update. It is interesting to compare the trajectory of the end points of the Estimator and Envelope against the smoothing function at each update. Because the kernel smoothing is recomputed on every update, the history is lost at each update (repaint).
I've added a feature to allow adjustment to the kernel smoothing algorithm as suggested by thomsonraja (line 59).
The settings and usage are repeated from Lux Algo below.
Settings
Window Size: Determines the number of recent price observations to be used to fit the Nadaraya-Watson Estimator.
Bandwidth: Controls the degree of smoothness of the envelopes, with higher values returning smoother results.
Mult: Controls the envelope width.
Src: Input source of the indicator.
Kernel power: See line 59, adjusts the exponential power (powh) as suggested by thomsonraja
Kernel denominator: See line 59, adjusts the denominator (den) as suggested by thomsonraja
Usage
This tool outlines extremes made by the prices within the selected window size. This is achieved by estimating the underlying trend in the price using kernel smoothing, calculating the mean absolute deviation from it, and adding/subtracting it from the estimated underlying trend.
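A compact Python sketch of that estimate-plus-MAD-envelope logic (Gaussian kernel; the window, bandwidth, and multiplier mirror the settings above, but this is not Lux Algo's source, and it repaints for the same reason):

```python
import numpy as np

def nw_envelope(src, window: int = 500, bandwidth: float = 8.0, mult: float = 3.0):
    y = np.asarray(src, dtype=float)[-window:]
    n = len(y)
    idx = np.arange(n)
    est = np.empty(n)
    for i in range(n):
        w = np.exp(-((idx - i) ** 2) / (2 * bandwidth ** 2))   # Gaussian kernel weights
        est[i] = np.sum(w * y) / np.sum(w)                     # kernel-weighted mean
    mad = mult * np.mean(np.abs(y - est))                      # scaled mean absolute deviation
    return est, est + mad, est - mad                           # estimate, upper band, lower band
```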
I repeat Lux Algo's caution: 'we do not recommend this tool to be used alone or solely for real time applications.'
End-Pointed SSA of Normalized Price Corridor [Loxx]End-Pointed SSA of Normalized Price Corridor applies an end-pointed SSA to normalized input price to output a smoothed, normalized price oscillator. Corridors are added in an attempt to decipher the larger trend direction of price; these corridor trend lines are based on the highs and lows of price. Due to the SSA algorithm, this indicator takes some time to load on the chart, so be patient. You can adjust the lag parameter downward to speed up the load time, but this will also degrade the signal. There are many different ways to use this indicator. It is also Renko chart friendly.
An example of emerging trends (these do not repaint)
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA . This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA . The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA , one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA , the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
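For readers who want to see the three steps concretely, here is a compact numpy sketch of a basic SSA reconstruction (delay embedding, SVD, reconstruction with diagonal averaging); the lag L, the number of retained components, and the end-pointing logic in the actual indicator are separate choices:

```python
import numpy as np

def ssa_reconstruct(series, L: int, n_components: int) -> np.ndarray:
    x = np.asarray(series, dtype=float)
    N = len(x)
    K = N - L + 1
    # 1. Trajectory matrix by delay embedding
    X = np.column_stack([x[i:i + L] for i in range(K)])
    # 2. Singular Value Decomposition
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    # 3. Reconstruction from the leading eigentriples, then diagonal (Hankel) averaging
    Xr = (U[:, :n_components] * s[:n_components]) @ Vt[:n_components]
    out = np.zeros(N)
    counts = np.zeros(N)
    for i in range(L):
        for j in range(K):
            out[i + j] += Xr[i, j]
            counts[i + j] += 1
    return out / counts
```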
Included
Bar coloring
Signals
Alerts
Loxx's Expanded Source Types
End-pointed SSA of Williams %R [Loxx]End-pointed SSA of Williams %R is an indicator that runs the Williams %R calculation through a Singular Spectrum Analysis (SSA) algorithm to derive a smoother final output. The reduction in noise compared to the traditional Williams %R is significant.
What is Williams %R?
Williams %R , also known as the Williams Percent Range, is a type of momentum indicator that moves between 0 and -100 and measures overbought and oversold levels. The Williams %R may be used to find entry and exit points in the market. The indicator is very similar to the Stochastic oscillator and is used in the same way. It was developed by Larry Williams and it compares a stock’s closing price to the high-low range over a specific period, typically 14 days or periods.
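For reference, the underlying Williams %R calculation is simple (a pandas sketch; the indicator then feeds this series into the SSA smoothing described below):

```python
import pandas as pd

def williams_r(df: pd.DataFrame, length: int = 14) -> pd.Series:
    # df columns assumed: high, low, close
    highest = df["high"].rolling(length).max()
    lowest = df["low"].rolling(length).min()
    return (highest - df["close"]) / (highest - lowest) * -100   # 0 (strongest) to -100 (weakest)
```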
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA . This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA . The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA , one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA , the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
Related Williams %R Indicators
Williams %R on Chart w/ Dynamic Zones
Williams %R w/ Bollinger Bands
Intermediate Williams %R w/ Discontinued Signal Lines
Related SSA Indicators
End-pointed SSA of FDASMA
End-pointed SSA of Normalized Price Oscillator
SSA of Price [Loxx]SSA of Price is an indicator that runs an SSA calculation on price to derive its final output. This indicator also serves to introduce the concept of SSA to the Pine Coder community. The data returned from this algorithm is an array of modeled values over the past X bars. Unlike the end-pointed SSA posted previously, this version pulls the modeled data from the output array and draws a line backward from the current bar. This indicator recalculates, so past observations aren't very useful; the current observation is, however, since the current bar is index 0 of the output array, which makes it the end-pointed value.
What is Singular Spectrum Analysis ( SSA )?
Singular spectrum analysis ( SSA ) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition ( SVD ) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA . This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA . The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA , one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA , the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
End-pointed SSA of Normalized Price Oscillator [Loxx]End-pointed SSA of Normalized Price Oscillator is an indicator that converts source price into a normalized oscillator and runs an SSA calculation to derive a smoother final output. This indicator also serves to introduce the concept of SSA to the Pine Coder community. The data returned from this algorithm is an array of modeled values over the past X bars. We could use all of this data, but it's not particularly useful, so instead we use the end-pointed value, which is the first value of the array at index 0.
What is Singular Spectrum Analysis (SSA)?
Singular spectrum analysis (SSA) is a technique of time series analysis and forecasting. It combines elements of classical time series analysis, multivariate statistics, multivariate geometry, dynamical systems and signal processing. SSA aims at decomposing the original series into a sum of a small number of interpretable components such as a slowly varying trend, oscillatory components and a ‘structureless’ noise. It is based on the singular value decomposition (SVD) of a specific matrix constructed upon the time series. Neither a parametric model nor stationarity-type conditions have to be assumed for the time series. This makes SSA a model-free method and hence enables SSA to have a very wide range of applicability.
For our purposes here, we are only concerned with the "Caterpillar" SSA. This methodology was developed in the former Soviet Union independently (the ‘iron curtain effect’) of the mainstream SSA. The main difference between the main-stream SSA and the "Caterpillar" SSA is not in the algorithmic details but rather in the assumptions and in the emphasis in the study of SSA properties. To apply the mainstream SSA, one often needs to assume some kind of stationarity of the time series and think in terms of the "signal plus noise" model (where the noise is often assumed to be ‘red’). In the "Caterpillar" SSA, the main methodological stress is on separability (of one component of the series from another one) and neither the assumption of stationarity nor the model in the form "signal plus noise" are required.
"Caterpillar" SSA
The basic "Caterpillar" SSA algorithm for analyzing one-dimensional time series consists of:
Transformation of the one-dimensional time series to the trajectory matrix by means of a delay procedure (this gives the name to the whole technique);
Singular Value Decomposition of the trajectory matrix;
Reconstruction of the original time series based on a number of selected eigenvectors.
This decomposition initializes forecasting procedures for both the original time series and its components. The method can be naturally extended to multidimensional time series and to image processing.
The method is a powerful and useful tool of time series analysis in meteorology, hydrology, geophysics, climatology and, according to our experience, in economics, biology, physics, medicine and other sciences; that is, where short and long, one-dimensional and multidimensional, stationary and non-stationary, almost deterministic and noisy time series are to be analyzed.
Included:
Bar coloring
Alerts
Signals
Loxx's Expanded Source Types
Itakura-Saito Autoregressive Extrapolation of Price [Loxx]Itakura-Saito Autoregressive Extrapolation of Price is an indicator that uses autoregressive analysis to predict future prices. This is a linear technique that was originally derived from speech analysis algorithms.
What is Itakura-Saito Autoregressive Analysis?
The technique of linear prediction has been available for speech analysis since the late 1960s (Itakura & Saito, 1973a, 1970; Atal & Hanauer, 1971), although the basic principles were established long before this by Wiener (1947). Linear predictive coding, which is also known as autoregressive analysis, is a time-series algorithm that has applications in many fields other than speech analysis (see, e.g., Chatfield, 1989).
Itakura and Saito developed a formulation for linear prediction analysis using a lattice form for the inverse filter. The Itakura–Saito distance (or Itakura–Saito divergence) is a measure of the difference between an original spectrum and an approximation of that spectrum. Although it is not a perceptual measure, it is intended to reflect perceptual (dis)similarity. It was proposed by Fumitada Itakura and Shuzo Saito in the 1960s while they were with NTT. For a spectrum P(w) and its approximation P^(w), the distance is defined as D_IS(P, P^) = (1/(2*pi)) * integral of [ P(w)/P^(w) - log(P(w)/P^(w)) - 1 ] dw. The Itakura–Saito distance is a Bregman divergence, but it is not a true metric since it is not symmetric and does not fulfil the triangle inequality.
read more: Selected Methods for Improving Synthesis Speech Quality Using Linear Predictive Coding: System Description, Coefficient Smoothing and Streak
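To make the general idea of autoregressive extrapolation concrete, here is a numpy sketch that fits AR coefficients with a Levinson-Durbin recursion on the sample autocorrelation and then iterates the model forward. This illustrates linear prediction in general, not the script's Itakura-Saito lattice formulation or its averaging options:

```python
import numpy as np

def levinson_durbin(r: np.ndarray, order: int) -> np.ndarray:
    # Solve the Toeplitz normal equations for prediction coefficients a_1..a_order.
    a = np.zeros(order + 1)
    e = r[0]
    for i in range(1, order + 1):
        acc = r[i] - np.dot(a[1:i], r[i - 1:0:-1])
        k = acc / e
        new_a = a.copy()
        new_a[i] = k
        for j in range(1, i):
            new_a[j] = a[j] - k * a[i - j]
        a = new_a
        e *= (1.0 - k * k)
    return a[1:]

def ar_extrapolate(series, order: int = 8, future_bars: int = 10) -> np.ndarray:
    x = np.asarray(series, dtype=float)
    mean = x.mean()
    x = x - mean
    # Sample autocorrelation up to the requested order
    r = np.array([np.dot(x[:len(x) - k], x[k:]) for k in range(order + 1)]) / len(x)
    coeffs = levinson_durbin(r, order)
    hist = list(x)
    preds = []
    for _ in range(future_bars):
        nxt = float(np.dot(coeffs, hist[-1:-order - 1:-1]))   # x_hat[n] = sum a_j * x[n-j]
        hist.append(nxt)
        preds.append(nxt + mean)
    return np.array(preds)
```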
Data inputs
Source Settings: Loxx's Expanded Source Types. You typically use "open", since the open is already final on the current active bar
LastBar - bar where to start the prediction
PastBars - how many bars back to model
LPOrder - order of linear prediction model; 0 to 1
FutBars - how many bars you want to forward predict
Things to know
Normally, a simple moving average is calculated on the source data. I've expanded this to 38 different averaging methods using Loxx's Moving Averages.
This indicator repaints
Related Indicators (linear extrapolation of price)
Levinson-Durbin Autocorrelation Extrapolation of Price
Weighted Burg AR Spectral Estimate Extrapolation of Price
Helme-Nikias Weighted Burg AR-SE Extra. of Price
APA Adaptive Fisher Transform [Loxx]APA Adaptive Fisher Transform is an adaptive cycle Fisher Transform using Ehlers Autocorrelation Periodogram Algorithm to calculate the dominant cycle period.
What is an adaptive cycle, and what is Ehlers Autocorrelation Periodogram Algorithm?
From Ehlers' book Cycle Analytics for Traders: Advanced Technical Trading Concepts by John F. Ehlers, 2013, page 135:
"Adaptive filters can have several different meanings. For example, Perry Kaufman’s adaptive moving average ( KAMA ) and Tushar Chande’s variable index dynamic average ( VIDYA ) adapt to changes in volatility . By definition, these filters are reactive to price changes, and therefore they close the barn door after the horse is gone.The adaptive filters discussed in this chapter are the familiar Stochastic , relative strength index ( RSI ), commodity channel index ( CCI ), and band-pass filter.The key parameter in each case is the look-back period used to calculate the indicator. This look-back period is commonly a fixed value. However, since the measured cycle period is changing, it makes sense to adapt these indicators to the measured cycle period. When tradable market cycles are observed, they tend to persist for a short while.Therefore, by tuning the indicators to the measure cycle period they are optimized for current conditions and can even have predictive characteristics.
The dominant cycle period is measured using the Autocorrelation Periodogram Algorithm. That dominant cycle dynamically sets the look-back period for the indicators. I employ my own streamlined computation for the indicators that provide smoother and easier to interpret outputs than traditional methods. Further, the indicator codes have been modified to remove the effects of spectral dilation.This basically creates a whole new set of indicators for your trading arsenal."
What is Fisher Transform?
The Fisher Transform is a technical indicator created by John F. Ehlers that converts prices into a Gaussian normal distribution.
The indicator highlights when prices have moved to an extreme, based on recent prices. This may help in spotting turning points in the price of an asset. It also helps show the trend and isolate the price waves within a trend.
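A plain fixed-length Fisher Transform sketch, to illustrate the normalize-and-transform step (the indicator instead sets the lookback adaptively from the dominant cycle measured by the autocorrelation periodogram):

```python
import numpy as np

def fisher_transform(price, length: int = 10) -> np.ndarray:
    price = np.asarray(price, dtype=float)
    fish = np.zeros(len(price))
    value = 0.0
    for i in range(len(price)):
        window = price[max(0, i - length + 1):i + 1]
        lo, hi = window.min(), window.max()
        raw = 0.0 if hi == lo else 2.0 * (price[i] - lo) / (hi - lo) - 1.0
        value = 0.33 * raw + 0.67 * value                    # smooth the normalized price
        value = min(max(value, -0.999), 0.999)               # keep the log argument finite
        fish[i] = 0.5 * np.log((1 + value) / (1 - value))    # the Fisher transform itself
        if i > 0:
            fish[i] += 0.5 * fish[i - 1]                     # Ehlers' smoothing of the output
    return fish
```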
Included:
Zero-line and signal cross options for bar coloring
Customizable overbought/oversold thresholds
Alerts
Signals