efficient frontier

I need to plot the efficient frontier for a minimum variance portfolio.  I do not have the function "portopt".  How can I do this without it?  Thanks
Elvis
10/17/2010 3:08:03 AM

On Oct 16, 10:08 pm, "Elvis " <emv...@nyu.edu> wrote:
> I need to plot the efficient frontier for a minimum variance portfolio.  I do not have the function "portopt".  How can I do this without it?  Thanks

How about something like:

R = [.04 .07];
Vol = [.2 .3];
rho = 0.5;
efficientFrontier(R, Vol, rho)

function efficientFrontier(assetReturn, assetVolatility, rho)
  % Sweep the weight of asset 1 (short positions allowed); asset 2 takes the rest
  asset1Weight = -10:0.01:10;
  asset2Weight = 1 - asset1Weight;
  % Portfolio expected return
  Y = assetReturn(1).*asset1Weight + assetReturn(2).*asset2Weight;
  % Portfolio volatility from the two-asset variance with correlation rho
  X = sqrt((assetVolatility(1)*asset1Weight).^2 + ...
           (assetVolatility(2)*asset2Weight).^2 + ...
           2*rho*assetVolatility(1)*assetVolatility(2)*asset1Weight.*asset2Weight);
  plot(X, Y);
  xlabel('Volatility');
  ylabel('Returns');
end
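
If you have more than two assets and the Optimization Toolbox, you can trace the minimum-variance frontier by solving one small quadratic program per target return with quadprog. Here is a rough sketch along those lines (the function name frontierQP and the example inputs below are just illustrations I made up, not anything from the original post):

function [risk, ret] = frontierQP(mu, Sigma, nPoints)
  % For each target return, minimize w'*Sigma*w subject to
  % sum(w) = 1 and mu'*w = target (short selling allowed).
  % Requires quadprog from the Optimization Toolbox.
  n = numel(mu);
  ret = linspace(min(mu), max(mu), nPoints);
  risk = zeros(1, nPoints);
  opts = optimset('Display', 'off');
  for k = 1:nPoints
    Aeq = [ones(1, n); mu(:)'];   % budget constraint and target-return constraint
    beq = [1; ret(k)];
    w = quadprog(Sigma, zeros(n, 1), [], [], Aeq, beq, [], [], [], opts);
    risk(k) = sqrt(w' * Sigma * w);   % resulting portfolio volatility
  end
  plot(risk, ret);
  xlabel('Volatility');
  ylabel('Returns');
end

Called, for example, as frontierQP([.04; .07; .06], [.04 .03 .01; .03 .09 .02; .01 .02 .05], 50), it should draw a three-asset frontier, assuming that covariance matrix stands in for your own data.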
boost
10/26/2010 8:30:58 PM
Similar articles about - efficient frontier:

Most efficient way to implement Python skimage's view_as_windows function (i.e. generalized n-d im2col)?
Hi, Using Python scikit-image's view_as_windows function, one can easily generate the sliding window view of any n-dimensional array (e.g. a 640*480*3 input array with 10*10*3 sliding window will have a 631*471*1*10*10*3 output array) extremely fast. However, several Matlab implementations (nested for-loop memory copying, matrix indexing method like im2col's, etc.) I tried to come up with are not as efficient as I expected. The fastest method I have now is as follows: function X = vaw(X, ws) for i = 1:ndims(X) SI = repmat(1:ws(i), [(size(X,i)-ws(i)+1) 1]) + repmat((0:(siz...

Re: Efficiently counting things other than records, #7
Ah, this looks excellent--thanks very much! I'm going to try just doing these types: mrn* general_class* gender* age_group mrn* specific_class* gender* age_group And then sum across (gender) and (gender age_group) to get the other counts I need. Thanks! -Roy -----Original Message----- From: Peter Crawford [mailto:peter.crawford@BLUEYONDER.CO.UK] Sent: Wednesday, March 22, 2006 6:42 AM To: SAS-L@LISTSERV.UGA.EDU; Pardee, Roy Subject: Re: Efficiently counting things other than records, by several levels of aggregation On Wed, 22 Mar 2006 05:36:09 -0800, Pardee, Roy <pardee.r@...

Re: a more efficient approach than using array? #14
hi, OK. a correction. as Paul showed before, just a boolean expression is not evaluated with short-circuiting. A boolean expression in the IF ... THEN statement is. So, here is an updated one. notice that the data step now runs even faster. On my pc, it ran in less than a second (0.88). cheers, chang /* Paul^s test data */ data meds1; array PRCDR_CD [ 6] $ 4; array DGNS_CD [10] $ 4; do id = 1 to 1e6; do _n_ = 1 to 10; if _n_ <= 6 then prcdr_cd[_n_] = put (ceil (ranuni(1) * 50) + 4700, z4.); dgns_cd[_n_] = put (ceil (ranuni(1) * 50) + 5400, z4...

FAQ 3.19 How can I make my CGI script more efficient? #25
This is an excerpt from the latest version perlfaq3.pod, which comes with the standard Perl distribution. These postings aim to reduce the number of repeated questions as well as allow the community to review and update the answers. The latest version of the complete perlfaq is at http://faq.perl.org . -------------------------------------------------------------------- 3.19: How can I make my CGI script more efficient? Beyond the normal measures described to make general Perl programs faster or smaller, a CGI program has additional issues. It may be run several times per secon...

Is there an efficient way to do this?
Suppose I have two inputs which are one nx2 cell matrix A and one mx2 cell matrix B: A={{'a','b'}, {'c'};{'d'},{'e','f'};...}; B={{'a'},{'c'};{'d','g'},{'e'};...} The output I need is similar to ismember(A,B,'rows') . The difference is that as long as the A{i,1} is a member of B{j,1}, I will say A{i,1}=B{j,1}, so the output for above two metric will be tf=[1 1 ...] I have a very long list of A and B, so I want to avoid to use for loop. Thanks in advance! "Wen" wrote in mess...

Efficiency of sqlldr
Hello - I have 20 database tables. The table spaces are configured on different disks. I have 10 Million rows of data for each table. (20 files with 10M rows each per table) Load Option #1: --------------- Use Oracle sqlldr (direct path) for all the 20 files with 10M rows of data each, into the respective table. One table is loaded after another. Load Option #2: --------------- Create batches of files, say 10,000 batches each with set of 20 files containng 1000 rows per file. Then I sqlldr the batches of files. One batch is processed after another. The 20 tables are populated in batches of ...

Re: Question about efficient data extraction #7
On Wed, Jun 4, 2008 at 3:17 PM, Nordlund, Dan (DSHS/RDA) < NordlDJ@dshs.wa.gov> wrote: > I have been helping new SAS users in my office get up to speed on accessing > a large SAS data repository. In the process I have discovered that at least > one of the approaches I have used historically is very inefficient (simle > data step merge). I thought I would describe a common type of task we do, > "benchmark" some approaches, and ask for ideas for further improvement. > > We do a lot of analyses of Medicaid claims data for various subpopulations > of Medic...

could be this more efficient?
Dear Matlab users, please help me if possible to make this code more efficient. k =1 for i=1:end if B(i) <= 1 A(k)=B(i); k=k+1; end end The point is how to select data from matrix B when B(i) <=1 without using loop. Is it possible at all? Thank you very much for remarks. Irek Try: A=B(B<=1); Regards, Stefan irek wrote: > > > Dear Matlab users, please help me if possible to make this code > more > efficient. > > k =1 > for i=1:end > if B(i) <= 1 > A(k)=B(i); > k=k+1; > end > end > > The point is how to select da...

with/inline view efficiency question
I am using "Oracle Database 10g Enterprise Edition Release 10.2.0.1.0 - Production" I am building some views. I am using a "WITH" clause that selects a large set of items, and then the main query joins with the "with" query on one of the fields so as to filter that result to get just the rows that are needed. I find this much easier to test as the with query can be run as-is to view all the data it would be selecting. Also, it is much less work for me when the main query needs to use multiple columns from the same sub query as the WITH sql is only...

how to calculate the energy efficiency using OFDM and MIMO in diversity methods
how to calculate energy efficiency in diversity methods ...

Efficient floodfill algorithm
G is two-dimensional matrix (for example, binarized image) On the picture 800x662 working time was 0.662 seconds. FloodFill = Compile[{{G, _Real, 2}, {p1, _Integer}, {p2, _Integer}, {tar, _Real}, {rep, _Real}}, Module[{Gdata = G, Q = {{p1, p2}}, QNew = {{0, 0}}, ind, x, y, u, dim, MaxLen, Maxl}, If [Gdata[[p1, p2]] != tar, Return[Gdata]]; dim = Dimensions[Gdata]; QNew = Table[{0, 0}, {i, 1, 2*Total[dim]}]; While [Length[Q] > 0, ind = 0; For [u = 1, u <= Length[Q], u = u + 1, {x, y} = Q[[u]]; If [Gdata[[x, y]] == tar, ...

Most efficient way to run update query
Hi all, Any thoughts on the best way to run an update query to update a specific list of records where all records get updated to same thing. I would think a temp table to hold the list would be best but am also looking at the easiest for an end user to run. The list of items is over 7000 Example: update imitmidx_sql set activity_cd = 'O', activity_dt = '20060601', prod_cat = 'OBS' where item_no = '001-LBK' update imitmidx_sql set activity_cd = 'O', activity_dt = '20060601', prod_cat = 'OBS' where item_no = '001-LYE' update...

ISO: efficient way to set xterm keyboard translations
Currently, I have a need to set a large series of keyboard translations in an xterm. While we have a options file that results in a mapping, there's a deployment concern about the solution. The concern is this - the mapping is about 80 lines of key mapping information. For a user to define just one key mapping of their own, it appears they have to include without modification all 80 lines of our key mapping. This leaves the xterm 'vulnerable' to accidental changes that may disable a number of applications reliant on the standard site mappings. What are the opti...

Efficient Networks 5360 and PPPOA
I have aquired at Efficient Networks 5360 modem which I want to use as a backup for my router. I want to connect the EN5360 direct to the Win2K PC as below ETHERNET ADSL Win2KPC ------------>EN5360----------->PPPOA ISP (UK ISP) I understand that the EN5360 is a bridge with no firmware and drivers are required on a PC or router to allow connection through it. According the tech specs, the EN5360 supports PPPOA Searching the web, it is implied that I require some PC software to get this to work, I have seen RASPPPOE and ENTERNET 300 which support PPPOE, ...

Linksys and Efficient DSL modem/router issues
Sorry for the prevous post with no subject (stupid computers! -- yeah) My DSL provider supplies an Efficient Network 5861 modem/router. The router section is set to "bridged" mode by the ISP (disabling the router/firewall feature) and I have a Linksys BEFSR41 behind the 5861. The BEFSR41 is set to be a DHCP server on the local side and fixed IP, DNS addresses on the WAN side. I had been using the Linksys as a DHCP client behind a simple DSL modem before my previous ISP (DirectPC) left the marketplace. I never had any complaints about stability or performance with DirectPC. Now, the...

Which is More Efficient?
I have a program that uses up a lot of CPU and want to make it is efficient as possible with what I have to work with it. So which of the following would be more efficient, knowing that l is a list and size is a number? l=l[:size] del l[size:] If it makes a difference, everything in the list is mutable. Measure it and find out. Sounds like a little investment in your time learning how to measure performance may pay dividends for you. "Dustan" <DustanGroups@gmail.com> writes: > I have a program that uses up a lot of CPU and want to make it is > efficient as possibl...

Efficient memory truncation of a matrix
Is it possible to truncate a matrix without generating a temporary copy of the matrix in Matlab? I am trying to do the following: a = a(1:count,:); This doesn't work however if a is very large as it generates: ??? Out of memory. Type HELP MEMORY for your options. However I am only truncating the existing matrix (or hoping to), so I am assuming that a copy of the data is occurring, before reassigning that copy to the original matrix. Thoughts? Thanks. "Jim" <jah104@pitt.com> wrote in message <h0d7cn$rj9$1@fred.mathworks.com>... > Is it possible to truncate a...

Efficiency
When using slot-value and the same value is used more than once in a form, is it more efficient to use a LET to extract that value (and use the local variable repeatedly) or does it not make a difference? In fact, is using LET ineffient? Or is the difference so slight it doesn't really make a difference in most normal programming? WoodHacker <RamsayW@comcast.net> wrote: > When using slot-value and the same value is used more than once in a > form, is it more efficient to use a LET to extract that value (and use > the local variable repeatedly) or does it not make a...

c2d efficiency
I was just curious, does anyone know what the disadvantages (if any) of using the c2d function instead of just doing a transformation (ex. bilinear) by hand? Thanks ...

Re: Efficient Exclusion Method Sought #6 1547374
On Mon, 16 May 2005 11:54:27 -0400, Talbot Michael Katz <topkatz@MSN.COM> wrote: >Hi. > >Suppose I have a data set / table containing an ID field (among others); I >also have a separate list of IDs. I want to pull all the records from the >data set whose IDs are not on the list. I can do this as follows: > >proc sql; > create table filtab1 as > select * > from tab1 > where id1 not in (select id1 from listex); >quit; > >Now, imagine that tab1 and listex both have millions of records. The query >above does not perform especially well; either...

How to efficiently remove large directory tree
Hi, A oracle process has tried to dump a core but created recursively a very deeply nested directory structure like this instead: core_8286/core_8286/core_8286/core_8286/....etc etc. until the inodes ran out. The amount of subdirectories is estimated to be almost 100,000 looking at the increase of used inodes on the filesystem (vxfs on HPUX 11.0). The process of creating these dirs has stopped now. After enlarging the fs, there is free space and enough free inodes available. I'm trying to get rid of all the subdirs. I've tried to do a # rm -r from the first core_8286 level, but a...

What is more efficient?
Let's say I have a class with few string properties and few integers, and a lot of methods defined for that class. Now if I have hundreds of thousands (or even more) of instances of that class - is it more efficient to remove those methods and make them separate functions, or it doesn't matter? Thanks... -- _______ Karlo Lozovina - Mosor | | |.-----.-----. web: http://www.mosor.net || ICQ#: 10667163 | || _ | _ | Parce mihi domine quia Dalmata sum. |__|_|__||_____|_____| On Feb 19, 2:17 pm, Karlo Lozovina <...

How to measure code-efficiency?
Every now and then and found two ways to resolve a problem. I try to find a way to decide which one is best (at least speed-wise) but I don't know how do I test how long those it take a program to run? Gaijinco wrote: > Every now and then and found two ways to resolve a problem. I try to > find a way to decide which one is best (at least speed-wise) but I > don't know how do I test how long those it take a program to run? I would you a regular timer (one of those that are used during marathons): start in one hand, while you press enter with a different one. Good luck... ...

How to solve linear systems efficiently
Hello! Is there a way to solve the linear system x=-b/A, where A is a overdetermined, rectangular, complex, double matrix with ~40% zeros, more memory and/or time efficient than mrdivide? Since I've got 8Gb main memory, the size of the matrix is limited to below 4Gb, but I'd like to solve the equation for larger matrixes and, because I've got to solve many matrixes, I'd like to do it as fast as possible (in case it's impossible to solve bigger matrixes, it would be also helpful to solve the <4Gb-matrixes faster). Thanks, Jan Paskarbeit On Mar 3, 9:39=A0am, "Jan...

efficiency in compare
All, I need an efficiency improvement... I have a single column of numbers in variable speed. With a for loop and an if statement I can look at each value and determine if it is 700 and below. I would think there is a MATLAB function that would do the same thing. Could some one please point me in the right direction in the help files to find it? I would like the function to return the first row number at which the value at that row is less than 700 (or whatever). Thank you in advance... Chmical Maybe : ind = find(vector <= 700) res = ind(1) Caroline Thank you very much that works gr...