RSI - Relative Strength Index - formula to PowerPivot

  • Question

  • Hello All

    We're trying to add the RSI (Relative Strength Index) formula to monthly data within PowerPivot.

    Most of the work is done in the sample below:

    Relative Strength Index Sample


    But it gets complicated quickly when recursion is needed.

    We already have the data model with the calendar, some sample data and the first working formulas in the sample.
    The rest of the needed formulas are in the Excel spreadsheet, but we need them within PowerPivot.
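
    (For reference, this is the standard Wilder RSI the spreadsheet implements - a recap, not taken from the sample file:)

    $$RS_t = \frac{\mathrm{AvgGain}_t}{\mathrm{AvgLoss}_t}, \qquad RSI_t = 100 - \frac{100}{1 + RS_t}$$

    where the averages are seeded with a simple 14-period mean and then updated recursively:

    $$\mathrm{AvgGain}_t = \frac{13\,\mathrm{AvgGain}_{t-1} + \mathrm{Gain}_t}{14}, \qquad \mathrm{AvgLoss}_t = \frac{13\,\mathrm{AvgLoss}_{t-1} + \mathrm{Loss}_t}{14}$$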

    Can you please take a look and help?
    Friday, April 10, 2015 7:58 PM


All replies

  • I just took a quick look at your model and have several questions:

    1) why do you have two different tables with data - what's the difference between 'MonthlyQuery' and 'MonthlyData'?
    2) you also have two measures, [Sum of CLOSE] and [Sum of CLOSE 2], referring to columns in the tables above, but both measures are used in your calculations seemingly at random - was this on purpose?
    3) your data is not always mapped to the last day of the month - why is that?
    4) for your AvgGain/AvgLoss calculation you have two different formulas - how do you distinguish which one to use?

    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Monday, April 13, 2015 8:45 AM
    Answerer
  • Hi Gerhard and thank you very much for stepping in.

    Replying:

    1) The "MonthlyQuery" is an Access query that gets the Monthly data from the daily End Of Day (Open High Low Close Volume) of thousands of symbols.
    Apart of that we need to keep the data updated, so we get the current day ("Daily" query) and we use "MonthlyData" to have both ("MonthlyQuery" and "Daily") appended. (is a PowerQuery "Append" query) 

    2) This was an error. I guess that it´s because I had 2 documents opened at the same time using the same name and excel wouldn´t allowed it.
    We can ignore that. The measures should use [Sum of CLOSE].

    3) The Access query gets the last trading day of the month.
    The day isn't really necessary here. We could use the months only.

    4) This is the problem: the second one is based on the previous one (the average of the last 14 days of gain/loss). This is the recursion I was talking about.

    Do you have any idea on how we can solve this?

    Please ask if you have more questions.
    I appreciate your suggestions and will update the sample file as we go along.


    • Edited by NicoPer Wednesday, April 15, 2015 9:12 PM
    Monday, April 13, 2015 2:44 PM
  • well, in general DAX does not support recursive calculations 

    However, during my investigations into your problem I came up with a solution for recursive calculations in general, which I am going to publish a blog post about on Monday 

    unfortunately this approach does not work in your scenario, as the calculation in your recursion is more complex than a simple multiplication :(

    just drop me a mail if you want to take a look at the blog post beforehand and check whether you can make use of it anyway 
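
    (To sketch the idea: a purely multiplicative recursion collapses into a closed form that DAX can evaluate without row-by-row recursion,

    $$x_t = x_{t-1} \cdot m_t \quad\Longrightarrow\quad x_t = x_0 \cdot \prod_{i=1}^{t} m_i,$$

    whereas a recurrence that also adds a new term each period, like the RSI smoothing here, does not reduce to a single product over the preceding rows.)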

    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Friday, April 17, 2015 12:01 PM
    Answerer
  • Hi Nico,

    try this Power Query code - doesn't this solve your problem?

    let
        // source table with monthly data per symbol
        Quelle = Excel.CurrentWorkbook(){[Name="MonthlyData_"]}[Content],
        // sort so that each symbol's rows are in date order
        SortRows = Table.Sort(Quelle,{{"SYMBOL", Order.Ascending}, {"DATEA", Order.Ascending}}),
        // two staggered indices (0-based and 1-based) to join every row to its predecessor
        AddIndex0 = Table.AddIndexColumn(SortRows, "Index", 0, 1),
        AddIndex1 = Table.AddIndexColumn(AddIndex0, "Index.1", 1, 1),
        JoinPrevRow = Table.NestedJoin(AddIndex1,{"Index"},AddIndex1,{"Index.1"},"NewColumn"),
        ExpandFields = Table.ExpandTableColumn(JoinPrevRow, "NewColumn", {"SYMBOL", "DATEA", "CLOSE", "MVOLUME"}, {"PrevLine.SYMBOL", "PrevLine.DATEA", "PrevLine.CLOSE", "PrevLine.MVOLUME"}),
        // net change vs. the previous row, only within the same symbol
        NetChg_ = Table.AddColumn(ExpandFields, "NetChg", each if [SYMBOL]=[PrevLine.SYMBOL] then [CLOSE]-[PrevLine.CLOSE] else null),
        Gain_ = Table.AddColumn(NetChg_, "Gain", each if [NetChg] > 0 then [NetChg] else 0),
        RemoveErrors = Table.RemoveRowsWithErrors(Gain_, {"Gain"}),
        Loss_ = Table.AddColumn(RemoveErrors, "Loss", each if [NetChg]<=0 then -[NetChg] else 0),
        RemoveErrors2 = Table.RemoveRowsWithErrors(Loss_, {"Loss"}),
        // buffer the gain/loss columns as lists for fast positional access
        ListGain = List.Buffer(RemoveErrors2[Gain]),
        ListLoss = List.Buffer(RemoveErrors2[Loss]),
        // staggered indices again, 14 apart, to find the row 14 periods back
        Prev14Index = Table.AddIndexColumn(RemoveErrors2, "Prev14IndexMatch", 15, 1),
        NewIndex = Table.AddIndexColumn(Prev14Index, "NewIndex", 1, 1),
        JoinPrev14 = Table.NestedJoin(NewIndex,{"NewIndex"},NewIndex,{"Prev14IndexMatch"},"NewColumn"),
        Prev14Symbol = Table.ExpandTableColumn(JoinPrev14, "NewColumn", {"SYMBOL"}, {"Prev14.SYMBOL"}),
        // 14-period average = (cumulative sum to date minus cumulative sum 14 rows back) / 14
        AvgGain_ = Table.AddColumn(Prev14Symbol, "AvgGain", each if [SYMBOL]=[Prev14.SYMBOL] then (List.Sum(List.FirstN(ListGain,[NewIndex])) - List.Sum(List.FirstN(ListGain,[NewIndex]-14)))/14 else 0),
        AvgLoss_ = Table.AddColumn(AvgGain_, "AvgLoss", each if [SYMBOL]=[Prev14.SYMBOL] then (List.Sum(List.FirstN(ListLoss,[NewIndex])) - List.Sum(List.FirstN(ListLoss,[NewIndex]-14)))/14 else 0),
        RS_ = Table.AddColumn(AvgLoss_, "RS", each [AvgGain]/[AvgLoss]),
        // RSI = 100 - 100/(1+RS), with RSI = 100 when there are no losses
        #"14DayRSI" = Table.AddColumn(RS_, "14-Day RSI", each if [AvgLoss]=0 then 100 else 100-(100/(1+[RS])))
    in
        #"14DayRSI"


    Imke

    • Proposed as answer by Michael Amadi Saturday, April 18, 2015 8:49 AM
    Saturday, April 18, 2015 6:50 AM
    Answerer
  • Thanks Gerhard, I'm checking your blog.
    I think Imke's approach using Power Query could be what we're looking for.

    Hi Imke :)

    Awesome, thanks so much. Another rabbit from the hat..
    I didn't know such things could be done in Power Query.

    I've added it and related it to the model, and it's in the file below:
    RSI V2

    While I don't understand exactly how it works (I don't know how to code), I see that you built a sort of complex (for me at least) index-match construction to get around the recursion problem within Power Query.

    The query was asking for a table, so I loaded it into the worksheet too.

    I put the data in different tables to compare with the Excel results, and the RSI numbers are somewhat close, but they don't really match.

    I have looked (with my limited knowledge) but couldn't find where the error could be.

    Where do you think it could be?
    Saturday, April 18, 2015 8:31 PM
  • Hi Nico,

    my values are different because I used the "standard" approach to RSI: a pure 14-day average of gains and of losses, divided by each other and "indexed" like you described.

    In your approach, you don't use a pure 14-day average, but a value that always drags some part of your very first values along with it. To my understanding, the "recursion" introduces errors into your calculation.

    That's why I used the PQ approach - that way you can reconcile more easily how the values develop :-) Sorry, I should have been clearer on that.

    In the end, you can switch back to PowerPivot, because now you have quite straightforward requirements.

    But on the other hand, maybe the PQ approach would be beneficial for your model in terms of performance under certain conditions. An interesting aspect here is the technique from Bill Szysz (https://social.technet.microsoft.com/Forums/en-US/b53dd3b5-0127-435a-82d2-fd8b8839d11f/mutable-value-variable?forum=powerquery#c6b0aab2-102a-453c-a6ae-6ac9f72c77d5) that reduces the amount of columns to be iterated through, using List.Buffer. Maybe he also has some ideas on how to improve my approach of grabbing the 14-day values: I subtracted the cumulative value up to 14 days before from the cumulative value up to date. If one could come up with a technique that just grabs the 14 days…

    So if you have lots of complicated values to calculate, which might also require intermediate steps or even calculated columns, the PQ approach might be faster. Definitely something to look into. But if you want to do this precalculation in PQ and then pass the results on to PP to work with further - just make sure that you clean up your table and get rid of all the index columns (I just left them in to make reconciliation easier) and any other columns you don't really need there, as they would bring down PP performance!
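
    (A sketch of what "just grabbing the 14 days" might look like, reusing the step names from the query above - untested, just to illustrate the idea:)

    // hypothetical replacement for the AvgGain_ step: average the previous
    // 14 buffered gains directly instead of subtracting cumulative sums
    AvgGain_ = Table.AddColumn(Prev14Symbol, "AvgGain", each
        if [SYMBOL] = [Prev14.SYMBOL]
        then List.Average(List.Range(ListGain, [NewIndex] - 14, 14))
        else 0),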


    Imke


    Sunday, April 19, 2015 6:48 AM
    Answerer
  • The thing is that the RSI doesn't use simple averages.
    The formulas noted in cells U20 and V20 use the 14-period average only as a starting point, and that's the problem.



    U20 =((U19*13)+S20)/14
    V20 =((V19*13)+T20)/14
    (The 14-period averages are on line 19.)


    I could make the averages calculated fields, but the problem is still the formulas based on those.

    Isn't it possible to use the average only as a starting point?
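
    (For reference, such a seeded recurrence can be expressed in Power Query with List.Generate - a minimal sketch with made-up numbers, where AvgGain14 stands for the simple 14-period average used as the starting point:)

    let
        // hypothetical seed: the simple 14-period average (the "starting point")
        AvgGain14 = 0.5,
        // hypothetical gains for the periods after the seed window
        Gains = List.Buffer({0.2, 0.0, 0.7, 0.1}),
        // Wilder smoothing: each value = (previous value * 13 + current gain) / 14
        SmoothedAvgGain = List.Generate(
            () => [Counter = 0, Value = AvgGain14],
            each [Counter] <= List.Count(Gains),
            each [Counter = [Counter] + 1,
                  Value = ([Value] * 13 + Gains{[Counter]}) / 14],
            each [Value])
    in
        SmoothedAvgGain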

    • Edited by NicoPer Sunday, April 19, 2015 10:28 PM
    Sunday, April 19, 2015 10:26 PM
  • Hi Nico,

    thanks for the fresh-up - that really clears it up :-)

    It should also be doable with PQ, but I will need some time to put it together - I have a busy week, so if someone else wants to step in ... welcome :-)


    Imke

    Monday, April 20, 2015 6:53 AM
    Answerer
  • So, just published:

    http://blog.gbrueckl.at/2015/04/recursive-calculations-powerpivot-dax/

    just take a look at it and see if it helps you!

    -gerhard


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Monday, April 20, 2015 12:23 PM
    Answerer
  • Imke, I highly appreciate your support. Thanks so much for taking the time. I'm learning a lot.

    Gerhard, thanks. I'll take a look and update.
    Monday, April 20, 2015 9:21 PM
  • Gerhard, that's pretty cool - but I couldn't figure out how to utilize it for this case.

    Quickly put a PQ solution together: https://onedrive.live.com/edit.aspx?cid=DE165DDF5D02DAFF&resid=de165ddf5d02daff%218260&app=Excel

    It works, and I see it as a strange mixture of awesome & awkward :-) But I'm in a hurry - that's all I can deliver at the moment.


    Imke

    Tuesday, April 21, 2015 7:57 AM
    Answerer
  • yes, as I said, my solution does not work for this scenario: it is not possible to calculate the multiplier that would be needed to get the correct result for the next month, because the formula is more complex and references both the current month's values and the previous month's values

    this is more of a mathematical problem than a technical one :(


    Gerhard Brueckl
    blogging @ http://blog.gbrueckl.at
    working @ http://www.pmOne.com

    Tuesday, April 21, 2015 5:48 PM
    Answerer
  • Wow, that is.. Wunderbar!!

    Awesome lateral thinking. Still studying it..
    No worries about the time.

    Is the table on the worksheet needed, or can we keep the RSI in the model only?
    It would be great to be able to manage more than 1 million rows (we already have 500k).

    Tuesday, April 21, 2015 9:29 PM
  • :-)

    No need to have anything on a worksheet: You just connect to your query and load to the data model.

    I’ve optimized the code (Table.FromColumns eliminates the need for the separate lists) – now it’s all in one query (TAvgAll), speed has improved!

    https://onedrive.live.com/edit.aspx?cid=DE165DDF5D02DAFF&resid=de165ddf5d02daff%218400&app=Excel
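
    (In case it helps to see it: Table.FromColumns assembles parallel lists into table columns, which is what replaces the separate list queries here - a minimal standalone sketch with made-up values:)

    let
        // hypothetical lists standing in for the iterated gains, losses and dates
        AvgGains = {0.5, 0.48},
        AvgLosses = {0.3, 0.31},
        Dates = {#date(2015, 1, 30), #date(2015, 2, 27)},
        // zip the three lists into one table, column by column
        Result = Table.FromColumns({AvgGains, AvgLosses, Dates}, {"AvgGain", "AvgLoss", "DATEA"})
    in
        Result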

    But with that many rows you might need to split up this procedure, to improve performance or to get it through at all.

    The following steps could help:

    - Horizontally split your data table by SYMBOL, then unite the parts with an append query which loads directly into PP

    - Move the iteration calculation into a separate file - just adjust the file path:

    https://onedrive.live.com/edit.aspx?cid=DE165DDF5D02DAFF&resid=de165ddf5d02daff%218398&app=Excel

    Good luck!


    Imke

    Friday, April 24, 2015 7:04 PM
    Answerer
  • Awesome :)

    I'm having a hard time changing the sources though.
    I'm trying to join the appended query with your last query.

    1) Here's the appended query:

    let
        Source = Table.Combine({MonthlyQuery,Daily})
    in
        Source


    2) Here's yours: TblAvgAll(2)

    let
        fnFirst14_ = (SYMBOL) =>
        let
            Source = Excel.Workbook(File.Contents("C:\Users\Imke\Documents\BI\Foren\Antworten\TN_RSI-On-PowerpivotCleanDataSourcev3.xlsx")),
            Data_Sheet = Source{[Item="Data",Kind="Sheet"]}[Data],
            Header = Table.PromoteHeaders(Data_Sheet),
            // keep the first 14 rows of the given symbol (the seed period)
            Filter = Table.SelectRows(Header, each ([SYMBOL] = SYMBOL)),
            KeepFirst14 = Table.FirstN(Filter, 14)
        in
            KeepFirst14,

        fnRSI_ = (AvgGain14, AvgLoss14, SYMBOL) =>
        let
            Source2 = Excel.Workbook(File.Contents("C:\Users\Imke\Documents\BI\Foren\Antworten\TN_RSI-On-PowerpivotCleanDataSourcev3.xlsx")),
            Data_Sheet = Source2{[Item="Data",Kind="Sheet"]}[Data],
            Header = Table.PromoteHeaders(Data_Sheet),
            FilterSymbol = Table.SelectRows(Header, each ([SYMBOL] = SYMBOL)),
            RemoveFirst13 = Table.Skip(FilterSymbol,13),
            RemoveFirst14 = Table.Skip(FilterSymbol,14),
            ListGain = List.Buffer(RemoveFirst14[Gain]),
            ListLoss = List.Buffer(RemoveFirst14[Loss]),
            ListDate = List.Buffer(RemoveFirst13[DATEA]),
            // Wilder smoothing for the gains, seeded with the 14-period average
            Iterate = List.Generate(
                ()=>[Counter=0, Value_=AvgGain14],
                each [Counter]<=List.Count(ListGain),
                each [Counter=[Counter]+1,
                      Value_=([Value_]*13+ListGain{[Counter]})/14],
                each [Value_]),
            // the same smoothing for the losses, reading from ListLoss
            IterateLoss = List.Generate(
                ()=>[Counter=0, Value_L=AvgLoss14],
                each [Counter]<=List.Count(ListLoss),
                each [Counter=[Counter]+1,
                      Value_L=([Value_L]*13+ListLoss{[Counter]})/14],
                each [Value_L]),
            Table = Table.FromColumns({Iterate, IterateLoss, ListDate})
        in
            Table,

        Quelle = Excel.Workbook(File.Contents("C:\Users\Imke\Documents\BI\Foren\Antworten\TN_RSI-On-PowerpivotCleanDataSourcev3.xlsx")),
        Data_Sheet = Quelle{[Item="Data",Kind="Sheet"]}[Data],
        #"Erste Zeile als Header" = Table.PromoteHeaders(Data_Sheet),
        // one row per distinct SYMBOL
        #"Entfernte Duplikate1" = Table.Distinct(#"Erste Zeile als Header", {"SYMBOL"}),
        #"Andere entfernte Spalten" = Table.SelectColumns(#"Entfernte Duplikate1",{"SYMBOL"}),
        // seed averages from each symbol's first 14 rows
        #"Hinzugefügte benutzerdefinierte Spalte" = Table.AddColumn(#"Andere entfernte Spalten", "Function", each fnFirst14_([SYMBOL])),
        #"Function erweitern" = Table.ExpandTableColumn(#"Hinzugefügte benutzerdefinierte Spalte", "Function", {"Gain", "Loss"}, {"Gain", "Loss"}),
        #"Gruppierte Zeilen" = Table.Group(#"Function erweitern", {"SYMBOL"}, {{"AvgGain14", each List.Average([Gain]), type number}, {"AvgLoss14", each List.Average([Loss]), type number}}),
        // run the recursive smoothing per symbol
        #"Hinzugefügte benutzerdefinierte Spalte1" = Table.AddColumn(#"Gruppierte Zeilen", "AvgGain", each fnRSI_([AvgGain14], [AvgLoss14], [SYMBOL])),
        #"AvgGain erweitern" = Table.ExpandTableColumn(#"Hinzugefügte benutzerdefinierte Spalte1", "AvgGain", {"Column1", "Column2", "Column3"}, {"AvgGain", "AvgLoss", "DATEA"}),
        #"Entfernte Spalten" = Table.RemoveColumns(#"AvgGain erweitern",{"AvgGain14", "AvgLoss14"})
    in
        #"Entfernte Spalten"

    Getting all sorts of errors.



    • Edited by NicoPer Saturday, April 25, 2015 3:19 PM
    Saturday, April 25, 2015 3:13 PM
  • Hi Nico,

    if you want to connect directly to your query (the append) - this corresponds to the query "YourSourceQuery" in my first workbook. In this case, you don't use the second workbook at all.

    You only need to adjust the query "Step1", not TAvgAll: replace "YourSourceQuery" with the name of your append query (1).
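
    (As a sketch - assuming your append query keeps the name "MonthlyData" from earlier in this thread, the adjusted "Step1" reduces to something like:)

    // hypothetical Step1 after the replacement: reference the internal append query
    let
        Source = MonthlyData
    in
        Source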

    That should be all there is to do :-)

    One more hint re performance: Increase your swap file, if RAM tends to be the problem.

    The second workbook would only be used if you need to split the process up. But in that case, the result of "Step1" would need to sit in a sheet in the workbook; there's no other way to connect to data in an Excel workbook.

    Please let me know how it worked!


    Imke

    Saturday, April 25, 2015 3:33 PM
    Answerer
  • No luck yet..

    I'm copying the queries to the original workbook (only 53k rows now). 

    The Step1 query is loading without problems, even when using the appended query.

    But when I put TAvgAll in, it starts doing something (?) and never stops (see picture below).
    I even changed the appended query to use only the historical Monthly query.
    You can see in the picture below => 149 MB and rising.

     

    I also tried starting with YourSourceQuery and renaming the query later as you said, but got the same problem.
    (I also added 100 MB cache and have 8 GB DDR3.)

    Maybe it's too much data?
    Was it working on your end with 200k rows?

    I wonder if I'm doing something wrong, and what it could be.


    • Edited by NicoPer Saturday, April 25, 2015 11:27 PM
    Saturday, April 25, 2015 11:25 PM
  • 200k went through in 10 minutes, but speed seems to decrease exponentially.

    I think we need a different approach. Will come back with it on Tuesday (busy).


    Imke

    Sunday, April 26, 2015 5:13 PM
    Answerer
  • Ok, np.

    Yes, I noticed the exponential increase.

    The machine starts making noise and the CPU maxes out, so I'm closing the file before it finishes.

    Maybe separating it into smaller queries? (I say this without knowing anything :)

    Sunday, April 26, 2015 7:21 PM
  • I’ve tuned it again & now you can choose between:

    1. Processing 500k rows in under 5 minutes (if you take it in 3 steps – meaning 3 different Excel files, one after another) – if you reach the Excel row limit, you just add steps 1+2 in parallel and do an append query on them in step 3: step1, step2, step3
    2. Processing 500k rows in about 1 hour, if you take it in 1 step (100k would process in 1.5 minutes (!), but speed decreases immensely from there): AllIn1

    This was quite an adventure – I'm amazed at the differences in processing.

    Please let me know how they behave with your dataset :-)

    If you run into trouble, check whether your pagefile.sys is big enough.


    Imke

    Thursday, April 30, 2015 5:11 PM
    Answerer
  • Awesome, thank you so much!


    I wanted to tell you that the speed problem got me thinking. Yesterday I remembered a problem I had with Access a while ago: I was trying to get the all-time max close for every symbol (thousands) in the database and wanted to build it directly in Access.

    The developer built a query/code to get it, but there were millions of rows (all symbols' End of Day data, some of them spanning tens of years..) and it took a huge amount of time and resources to complete.

    The problem was that the script was checking the max price of every symbol in -every row/day-! (It started somewhat fast, but then began to slow down exponentially to turtle speed.)
    The solution was to just check the previous day's max and, if the new value was bigger, update the current row.. And that made the daily updates a breeze, updating thousands of symbols almost instantly.
    Unfortunately this is much more complex than that.
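
    (In M terms, the same incremental idea could look like this - a sketch with made-up closes, carrying the previous maximum forward instead of rescanning all history per row:)

    let
        // hypothetical closes for one symbol, in date order
        Closes = List.Buffer({10, 12, 11, 15, 14}),
        // running max: compare each close only against the previous maximum
        RunningMax = List.Generate(
            () => [i = 0, Max = Closes{0}],
            each [i] < List.Count(Closes),
            each [i = [i] + 1, Max = List.Max({[Max], Closes{[i] + 1}})],
            each [Max])
    in
        RunningMax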


    Ok, will check them and come back.


    • Edited by NicoPer Friday, May 1, 2015 3:04 AM
    Friday, May 1, 2015 3:01 AM
  • Imke, can you please upload your sample file from:
    C:\Users\Imke\Documents\BI\Foren\Antworten\TN_RSI__DatenquelleNeuV2.xlsx ?

    I'm still stuck on the replacement. I just want to make sure I have the exact same file structure.
    (Yes, I'm a bit dumb :)

    Friday, May 1, 2015 9:52 PM
  • Oops, just realized that I had already added the column "Binder" in the data source:

    DataSource:

    https://onedrive.live.com/edit.aspx?cid=DE165DDF5D02DAFF&resid=de165ddf5d02daff%218628&app=Excel

    Sorry about the confusion.

    I've added another file where this is done in the first query & which also refers to an internal query, so you can include step1 in the file where you collect your data (the difference from the query referencing external data is "PromoteFirstRowAsHeader"!):

    https://onedrive.live.com/edit.aspx?cid=DE165DDF5D02DAFF&resid=de165ddf5d02daff%218626&app=Excel


    Imke

    Saturday, May 2, 2015 5:25 AM
    Answerer
  • Having millions of rows seems too difficult to manage, and I think we could still get the most relevant info with only 100k rows or less.

    If you say we could have 100k-row file updates from an external file within 2 minutes, that sounds like the ideal solution.

    So I'm using the workbook with the AllIn1 file: I've copied the same structure but am still getting different errors, now about the ordering of the "SYMBOL" and "DATEA" columns.

    You've done so much already that I'd hate to ask for a bit more help. It's just that there are so many details that could be stopping this from working. It would really help if you could see the workbook yourself.

    Below is the workbook with 50k rows (all monthly ETF data from 2011 to date) to use as the external file:
    External Data


    Could you please use it as an external file with AllIn1?
    Last thing, I promise :)

    Saturday, May 2, 2015 9:23 PM
  • Hi Nico,

    I’ve integrated the code in your file:

    ExternalMonthAllInOne.xlsx

    But I doubt you will be happy with the performance of this solution (your 50k dataset took 20 minutes – my guess for 200k is nearly 2 hours; the reason is the many unique SYMBOLs).

    I've also adjusted the 3-step files. They perform so much better (every query on your dataset runs well under a minute!).

    Step1: ExternalMonthly.xlsx

    Step2: TN_RSI_PQ_Step2.xlsx

    Step3: TN_RSI_PQ_Step3.xlsx

    You just need to adjust the path of the source files in Step2 and Step3:

    In Step2, edit the query Step1.

    The applied step “Source” will already be selected.

    The easiest way to adjust the path of your source query is to click on the gear wheel (2); the following dialog will pop up:

    Click Browse, choose the folder where the new files are, and double-click the new file:

    Do the same for step1 and step2 in Step3.xlsx and you’re done – forever :-)

    Enjoy the speed – it’s the difference between day and night!
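
    (For orientation: the "Source" step in question is just an Excel.Workbook/File.Contents call, so adjusting the path amounts to changing one string - hypothetical path shown:)

    // hypothetical adjusted "Source" step in Step2, pointing at Step1's new location
    Source = Excel.Workbook(File.Contents("C:\YourFolder\ExternalMonthly.xlsx"))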


    Imke

    • Marked as answer by NicoPer Sunday, May 3, 2015 8:24 PM
    Sunday, May 3, 2015 11:42 AM
    Answerer
  • Yes!! Finally!! I'm so happy :)

    Imke, you simply rock.. Thanks a million!!

    I still need to run some tests, but everything seems to be working perfectly.
    I'll start playing with the new toy and will try to post a chart or two for the people reading this.

    Note: None of the stock screeners available today gives you the versatility you can get with your own database and these tools.

    I must thank Imke for making an awesome solution available to everyone.

    Sunday, May 3, 2015 8:38 PM
  • Hi Nico,

    thanks for the flowers :-)

    Very much looking forward to my real reward: seeing the charts where it all comes to life!

    So if there’s still anything not going right, please don’t hesitate to come back.


    Imke

    Tuesday, May 5, 2015 5:12 PM
    Answerer
  • Fair enough :)

    It wasn't really the charts I was after, but a versatile screener.
    The current data query doesn't have OHLC, just the close, and only the last 4 years of monthly data, so the charts wouldn't be pretty either.

    A screener, on the other hand, allows some magic I couldn't find elsewhere.
    With it we'd know where to look to catch the big moves, taking the RSI as a measure of sentiment.

    Say, for example, that we want to know the companies/ETFs/currencies/whatever showing the highest/lowest RSI of the last 3 years on the monthly time frame. (Still have to do that, but I think I can manage :)


    I wrote the indicator shown on the sample charts below in TOS.
    It shows the 3Y highest (green), the 3Y lowest (red) and the monthly RSI (gray).
    The 3Y highest or lowest RSI is usually the start of a big run, as you can see in the samples below.

    US Dollar


    OIL



    Cool, isn't it?
    Obviously there's a need for confirmation using other instruments, but you get the idea.
    You can also build custom indices, like sub-industries using market cap, and apply the RSI to those.

    And now we can do all that in Excel, thanks to your invaluable help :)
    I'll be working on it over the following days and will post if something comes up.
    Tuesday, May 5, 2015 10:21 PM