<?xml version="1.0" encoding="utf-8" standalone="yes"?>http://www.shnenglu.com/guijie/category/20090.html杰哥好, 哈哈! (Hi, this is 杰哥's blog!)zh-cnTue, 09 Apr 2019 13:20:33 GMTTue, 09 Apr 2019 13:20:33 GMT60How to solve AX + XB = C for X using MATLAB?http://www.shnenglu.com/guijie/archive/2015/07/06/211161.html杰哥杰哥Mon, 06 Jul 2015 07:28:00 GMThttp://www.shnenglu.com/guijie/archive/2015/07/06/211161.htmlhttp://www.shnenglu.com/guijie/comments/211161.htmlhttp://www.shnenglu.com/guijie/archive/2015/07/06/211161.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/211161.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/211161.htmlX = sylvester(A,B,C)
http://cn.mathworks.com/help/matlab/ref/sylvester.html
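MATLAB's `sylvester(A,B,C)` solves AX + XB = C directly. As a cross-check, here is a small Python sketch of the same Sylvester equation via the Kronecker/vectorization identity; the function name `sylvester_solve` is my own, and SciPy ships an equivalent routine, `scipy.linalg.solve_sylvester`, that does this more efficiently (Bartels–Stewart).

```python
import numpy as np

def sylvester_solve(A, B, C):
    """Solve A X + X B = C using (I ⊗ A + B^T ⊗ I) vec(X) = vec(C),
    where vec() stacks columns (column-major order)."""
    n, m = C.shape
    M = np.kron(np.eye(m), A) + np.kron(B.T, np.eye(n))
    x = np.linalg.solve(M, C.reshape(-1, order="F"))  # column-major vec
    return x.reshape(n, m, order="F")

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))
B = rng.standard_normal((2, 2))
C = rng.standard_normal((3, 2))
X = sylvester_solve(A, B, C)
print(np.allclose(A @ X + X @ B, C))  # residual is essentially zero
```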

杰哥 2015-07-06 15:28 Post a comment
]]>
Alternating optimizationhttp://www.shnenglu.com/guijie/archive/2015/05/24/210729.html杰哥杰哥Sun, 24 May 2015 04:58:00 GMThttp://www.shnenglu.com/guijie/archive/2015/05/24/210729.htmlhttp://www.shnenglu.com/guijie/comments/210729.htmlhttp://www.shnenglu.com/guijie/archive/2015/05/24/210729.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/210729.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/210729.html      My personal understanding: these few terms are all equivalent.<br />

‘alternating optimization’ or ‘alternative optimization’?

Sue (UTS) comment: ‘Alternating’ means you use this optimization with another optimization, one after the other. ‘Alternative’ means you use this optimization instead of any other.

My GSM-PAF paper ends up using 'alternating optimization'.
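As a toy illustration of alternating optimization (my own example, unrelated to GSM-PAF): minimize f(x, y) = (x - 2y)^2 + (y - 1)^2 by exactly minimizing over one variable while the other is held fixed, then alternating.

```python
# Alternating optimization sketch: each step solves one subproblem in closed
# form while the other variable is frozen, then the roles are swapped.

def alternating_opt(x=0.0, y=0.0, iters=100):
    for _ in range(iters):
        x = 2 * y                 # argmin over x with y fixed: (x - 2y)^2 = 0
        y = (4 * x + 2) / 10      # argmin over y with x fixed: set df/dy = 0
    return x, y

x, y = alternating_opt()
print(x, y)  # converges to the global minimum (2, 1)
```

Each subproblem is solved exactly, so the objective is monotonically non-increasing, which is the basic convergence argument for this family of methods.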



杰哥 2015-05-24 12:58 Post a comment
]]>
Fully understanding maximum likelihood estimation</title><link>http://www.shnenglu.com/guijie/archive/2013/12/05/204609.html</link><dc:creator>杰哥</dc:creator><author>杰哥</author><pubDate>Thu, 05 Dec 2013 11:21:00 GMT</pubDate><guid>http://www.shnenglu.com/guijie/archive/2013/12/05/204609.html</guid><wfw:comment>http://www.shnenglu.com/guijie/comments/204609.html</wfw:comment><comments>http://www.shnenglu.com/guijie/archive/2013/12/05/204609.html#Feedback</comments><slash:comments>0</slash:comments><wfw:commentRss>http://www.shnenglu.com/guijie/comments/commentRss/204609.html</wfw:commentRss><trackback:ping>http://www.shnenglu.com/guijie/services/trackbacks/204609.html</trackback:ping><description><![CDATA[<div style="font-family: Verdana, Arial, Helvetica, sans-serif; line-height: 25px; background-color: #ffffff"> <div style="font-family: Verdana, Arial, Helvetica, sans-serif; line-height: 25px; background-color: #ffffff">This is parameter estimation material from probability theory and mathematical statistics; see Chapter 7 of the textbook, p. 168, and Section 3.11.1 of the pattern recognition notes (the material from Section 3.11 through Section 3.11.1 should be read).<br />Summary: maximum likelihood estimation first assumes the observed samples follow some distribution; the goal is to estimate the parameters of that distribution. The parameter values that maximize the probability of this set of samples are taken as the model parameters: write down the likelihood function, take its logarithm (the log-likelihood), average it (the mean log-likelihood), differentiate, set the derivative to zero, and solve for the parameters. As I currently understand it, the logarithm is needed because probabilities are usually small numbers, and a product of many of them becomes extremely small, which easily causes floating-point underflow on a computer.<br />Zhengxia also mentioned that the likelihood is simply a probability: the probability of what was observed.<br /><a >https://en.wikipedia.org/wiki/Likelihood_function</a></div></div><img src ="http://www.shnenglu.com/guijie/aggbug/204609.html" width = "1" height = "1" /><br><br><div align=right><a style="text-decoration:none;" href="http://www.shnenglu.com/guijie/" target="_blank">杰哥</a> 2013-12-05 19:21 <a href="http://www.shnenglu.com/guijie/archive/2013/12/05/204609.html#Feedback" target="_blank" style="text-decoration:none;">Post a comment</a></div>]]></description></item><item><title>How to use MATLAB to solve a quadratic optimization problem?http://www.shnenglu.com/guijie/archive/2012/11/21/195475.html杰哥杰哥Wed, 21 Nov 2012 10:31:00 
GMThttp://www.shnenglu.com/guijie/archive/2012/11/21/195475.htmlhttp://www.shnenglu.com/guijie/comments/195475.htmlhttp://www.shnenglu.com/guijie/archive/2012/11/21/195475.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/195475.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/195475.html

杰哥 2012-11-21 18:31 Post a comment
]]>
Taylor series in several variableshttp://www.shnenglu.com/guijie/archive/2012/10/31/194113.html鏉板摜鏉板摜Wed, 31 Oct 2012 02:48:00 GMThttp://www.shnenglu.com/guijie/archive/2012/10/31/194113.htmlhttp://www.shnenglu.com/guijie/comments/194113.htmlhttp://www.shnenglu.com/guijie/archive/2012/10/31/194113.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/194113.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/194113.htmlhttp://en.wikipedia.org/wiki/Taylor_series

Taylor series in several variables

The Taylor series may also be generalized to functions of more than one variable with

$$T(x_1,\dots,x_d) = \sum_{n_1=0}^\infty \sum_{n_2=0}^\infty \cdots \sum_{n_d=0}^\infty \frac{(x_1-a_1)^{n_1}\cdots (x_d-a_d)^{n_d}}{n_1!\cdots n_d!}\,\frac{\partial^{n_1+\cdots+n_d} f}{\partial x_1^{n_1}\cdots \partial x_d^{n_d}}(a_1,\dots,a_d).$$

For example, for a function that depends on two variables, x and y, the Taylor series to second order about the point (a, b) is:

$$\begin{aligned} f(x,y) \approx{} & f(a,b) + (x-a)\,f_x(a,b) + (y-b)\,f_y(a,b) \\ & + \frac{1}{2!}\left[(x-a)^2 f_{xx}(a,b) + 2(x-a)(y-b)\,f_{xy}(a,b) + (y-b)^2 f_{yy}(a,b)\right], \end{aligned}$$

where the subscripts denote the respective partial derivatives.

A second-order Taylor series expansion of a scalar-valued function of more than one variable can be written compactly as

$$T(\mathbf{x}) = f(\mathbf{a}) + \mathrm{D}f(\mathbf{a})^{T}(\mathbf{x}-\mathbf{a}) + \frac{1}{2!}(\mathbf{x}-\mathbf{a})^{T}\,\mathrm{D}^{2}f(\mathbf{a})\,(\mathbf{x}-\mathbf{a}) + \cdots,$$

where $\mathrm{D}f(\mathbf{a})$ is the gradient of $f$ evaluated at $\mathbf{x}=\mathbf{a}$ and $\mathrm{D}^{2}f(\mathbf{a})$ is the Hessian matrix. Applying the multi-index notation, the Taylor series for several variables becomes

$$T(\mathbf{x}) = \sum_{|\alpha|\ge 0} \frac{(\mathbf{x}-\mathbf{a})^{\alpha}}{\alpha!}\,(\partial^{\alpha} f)(\mathbf{a}),$$

which is to be understood as a still more abbreviated multi-index version of the first equation of this paragraph, again in full analogy to the single-variable case.

Example

[Figure: second-order Taylor series approximation (in gray) of the function $f(x,y)=e^x\log(1+y)$ around the origin.]

Compute a second-order Taylor series expansion around the point (a, b) = (0, 0) of the function

$$f(x,y)=e^x\log(1+y).$$

First, we compute all the partial derivatives we need:

$$\begin{aligned} f_x(a,b) &= e^x\log(1+y)\big|_{(x,y)=(0,0)} = 0,\\ f_y(a,b) &= \frac{e^x}{1+y}\Big|_{(x,y)=(0,0)} = 1,\\ f_{xx}(a,b) &= e^x\log(1+y)\big|_{(x,y)=(0,0)} = 0,\\ f_{yy}(a,b) &= -\frac{e^x}{(1+y)^2}\Big|_{(x,y)=(0,0)} = -1,\\ f_{xy}(a,b) &= f_{yx}(a,b) = \frac{e^x}{1+y}\Big|_{(x,y)=(0,0)} = 1. \end{aligned}$$

The Taylor series is

$$\begin{aligned} T(x,y) = f(a,b) & + (x-a)\,f_x(a,b) + (y-b)\,f_y(a,b) \\ & + \frac{1}{2!}\left[(x-a)^2 f_{xx}(a,b) + 2(x-a)(y-b)\,f_{xy}(a,b) + (y-b)^2 f_{yy}(a,b)\right] + \cdots, \end{aligned}$$

which in this case becomes

$$\begin{aligned} T(x,y) &= 0 + 0\,(x-0) + 1\,(y-0) + \frac{1}{2}\Big[0\,(x-0)^2 + 2(x-0)(y-0) + (-1)(y-0)^2\Big] + \cdots \\ &= y + xy - \frac{y^2}{2} + \cdots. \end{aligned}$$

Since $\log(1+y)$ is analytic in $|y|<1$, we have

$$e^x\log(1+y) = y + xy - \frac{y^2}{2} + \cdots$$

for $|y|<1$.
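The second-order expansion above can be checked numerically; the sketch below (my own addition) compares f(x, y) = e^x log(1+y) with T(x, y) = y + xy - y^2/2 near the origin, where the truncation error should be third order.

```python
import numpy as np

# Numerical check that the second-order Taylor polynomial derived above
# matches f near (0, 0): the error should shrink like h**3 as h -> 0.
def f(x, y):
    return np.exp(x) * np.log(1 + y)

def T(x, y):
    return y + x * y - y**2 / 2

for h in (0.1, 0.01):
    print(h, abs(f(h, h) - T(h, h)))  # error ~ h**3 / 3 for this f
```

(The leading error term along x = y = h is the sum of the third-order terms, y^3/3 - x y^2/2 + x^2 y/2, which equals h^3/3.)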



杰哥 2012-10-31 10:48 Post a comment
]]>
Jensen's inequalityhttp://www.shnenglu.com/guijie/archive/2012/10/30/194080.html鏉板摜鏉板摜Tue, 30 Oct 2012 04:04:00 GMThttp://www.shnenglu.com/guijie/archive/2012/10/30/194080.htmlhttp://www.shnenglu.com/guijie/comments/194080.htmlhttp://www.shnenglu.com/guijie/archive/2012/10/30/194080.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/194080.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/194080.html

If $\lambda_1$ and $\lambda_2$ are two arbitrary nonnegative real numbers such that $\lambda_1+\lambda_2=1$, then convexity of $\varphi$ implies

$$\varphi(\lambda_1 x_1+\lambda_2 x_2) \leq \lambda_1\,\varphi(x_1)+\lambda_2\,\varphi(x_2) \quad\text{for any } x_1, x_2.$$  [This is exactly the definition of a convex function.]

This can be easily generalized: if $\lambda_1, \lambda_2, \dots, \lambda_n$ are nonnegative real numbers such that $\lambda_1+\cdots+\lambda_n=1$, then

$$\varphi(\lambda_1 x_1+\lambda_2 x_2+\cdots+\lambda_n x_n) \leq \lambda_1\,\varphi(x_1)+\lambda_2\,\varphi(x_2)+\cdots+\lambda_n\,\varphi(x_n).$$

For example, $-\log(x)$ is a convex function.


杰哥 2012-10-30 12:04 Post a comment
]]>
Gradient Descent (梯度下降法) (two examples, from two strong papers, both of which use this method to optimize their objectives)http://www.shnenglu.com/guijie/archive/2012/10/19/193522.html杰哥杰哥Fri, 19 Oct 2012 05:33:00 GMThttp://www.shnenglu.com/guijie/archive/2012/10/19/193522.htmlhttp://www.shnenglu.com/guijie/comments/193522.htmlhttp://www.shnenglu.com/guijie/archive/2012/10/19/193522.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/193522.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/193522.htmlhttp://en.wikipedia.org/wiki/Gradient_descent 
http://zh.wikipedia.org/wiki/%E6%9C%80%E9%80%9F%E4%B8%8B%E9%99%8D%E6%B3%95
Gradient descent is based on the observation that if the multivariable function $F(\mathbf{x})$ is defined and differentiable in a neighborhood of a point $\mathbf{a}$, then $F(\mathbf{x})$ decreases fastest if one goes from $\mathbf{a}$ in the direction of the negative gradient of $F$ at $\mathbf{a}$, namely $-\nabla F(\mathbf{a})$.
Why does the step size need to change? Tianyi's explanation is good: if the step size is too large, the function value may actually increase, so the step size should be reduced (the picture below was drawn on paper and then scanned). The explanation in "Gradient Descent Intuition" of "II. Linear Regression with One Variable" in Andrew Ng's Coursera course Machine Learning is also very good: for example, at a point on the right side of the figure below, the gradient is positive, so $-\nabla F(\mathbf{a})$ is negative, which makes the current $a$ decrease.
Example 1: Fig. 1 ("Normalized graph Laplacian learning algorithm") of Toward the Optimization of Normalized Graph Laplacian (TNN 2011) is a very good example of gradient descent; only Fig. 1 needs to be read, the rest can be skipped. Fig. 1 pairs with Shuning's lecture notes on nonlinear optimization (4th slide on page 8, corresponding to textbook p. 124); the key is the line-search strategy, applying the step-size rule from the 4th slide on page 4: double or halve the step size. Whenever the objective decreases, move to the next search point and double the step size; otherwise stay at the current point and halve the step size.
Example 2: Distance Metric Learning for Large Margin Nearest Neighbor Classification (JMLR). The objective is Eq. 14, a quadratic form in the matrix M; expanding it shows that it is linear in M, hence convex. The derivative with respect to M is the formula between Eqs. 18 and 19 in the appendix, and that formula no longer contains M.

My own further thought: for a convex function, why not just set the partial derivative with respect to the variable to zero and solve for it, instead of running gradient descent? Example 2 shows this does not always work: after differentiating with respect to M, the result no longer involves M, so there is no closed-form solution. As discussed with Tianyi: gradient descent is used precisely because setting the derivative to zero yields no closed-form solution; when a closed-form solution exists, we are done.

http://blog.csdn.net/yudingjun0611/article/details/8147046

1. 姊害涓嬮檷娉?/strong>

姊害涓嬮檷娉曠殑鍘熺悊鍙互鍙傝冿細鏂潶紱忔満鍣ㄥ涔犵涓璁?/a>銆?/span>

鎴戝疄楠屾墍鐢ㄧ殑鏁版嵁鏄?00涓簩緇寸偣銆?/span>

濡傛灉姊害涓嬮檷綆楁硶涓嶈兘姝e父榪愯錛岃冭檻浣跨敤鏇村皬鐨勬闀?涔熷氨鏄涔犵巼)錛岃繖閲岄渶瑕佹敞鎰忎袱鐐癸細

1錛夊浜庤凍澶熷皬鐨?  鑳戒繚璇佸湪姣忎竴姝ラ兘鍑忓皬錛?/span>
2錛変絾鏄鏋滃お灝忥紝姊害涓嬮檷綆楁硶鏀舵暃鐨勪細寰堟參錛?/span>

鎬葷粨錛?/span>
1錛夊鏋滃お灝忥紝灝變細鏀舵暃寰堟參錛?/span>
2錛夊鏋滃お澶э紝灝變笉鑳戒繚璇佹瘡涓嬈¤凱浠i兘鍑忓皬錛屼篃灝變笉鑳戒繚璇佹敹鏁涳紱
濡備綍閫夋嫨-緇忛獙鐨勬柟娉曪細
..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1...
綰?鍊嶄簬鍓嶄竴涓暟銆?/span>

matlab婧愮爜錛?/span>

function [theta0,theta1]=Gradient_descent(X,Y);
theta0=0;
theta1=0;
t0=0;
t1=0;
while(1)
    for i=1:1:100 % 100 points
        t0=t0+(theta0+theta1*X(i,1)-Y(i,1))*1;
        t1=t1+(theta0+theta1*X(i,1)-Y(i,1))*X(i,1);
    end
    old_theta0=theta0;
    old_theta1=theta1;
    theta0=theta0-0.000001*t0 % 0.000001 is the learning rate
    theta1=theta1-0.000001*t1
    t0=0;
    t1=0;
    if(sqrt((old_theta0-theta0)^2+(old_theta1-theta1)^2)<0.000001) % convergence test; other criteria are possible
        break;
    end
end
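The learning-rate trade-off described above can be illustrated in a few lines of Python (my own toy example on f(theta) = theta^2, not part of the original experiment): a small rate decreases the objective but converges slowly, a moderate rate converges fast, and too large a rate makes the iterates diverge.

```python
# Gradient descent on f(theta) = theta**2, whose gradient is 2*theta.
def gd(alpha, theta=1.0, steps=50):
    for _ in range(steps):
        theta = theta - alpha * 2 * theta
    return theta

print(abs(gd(0.01)))   # small rate: decreasing, but still far from 0
print(abs(gd(0.4)))    # moderate rate: essentially at the minimum
print(abs(gd(1.1)))    # too large: the iterates blow up
```

Each update multiplies theta by (1 - 2*alpha), so the iteration converges exactly when |1 - 2*alpha| < 1, i.e. 0 < alpha < 1, matching the "not too large, not too small" advice above.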


2. Stochastic gradient descent

Stochastic gradient descent is suitable when the number of sample points is very large; the algorithm still moves the iterate, overall, in a direction of fast descent.

MATLAB source:

function [theta0,theta1]=Gradient_descent_rand(X,Y);
theta0=0;
theta1=0;
t0=theta0;
t1=theta1;
for i=1:1:100
    t0=theta0-0.01*(theta0+theta1*X(i,1)-Y(i,1))*1
    t1=theta1-0.01*(theta0+theta1*X(i,1)-Y(i,1))*X(i,1)
    theta0=t0
    theta1=t1
end
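A Python port of the stochastic update loop above (a sketch under my own assumptions: synthetic data y ≈ 2 + 3x with 100 points, and the same 0.01 learning rate as the MATLAB code, but run for several passes):

```python
import numpy as np

# Stochastic gradient descent for simple linear regression: each point
# updates the parameters immediately instead of accumulating a batch sum.
rng = np.random.default_rng(0)
X = rng.uniform(0, 1, 100)
Y = 2.0 + 3.0 * X + 0.01 * rng.standard_normal(100)  # y ≈ theta0 + theta1*x

theta0, theta1 = 0.0, 0.0
for _ in range(200):                  # several passes over the data
    for i in range(100):
        err = theta0 + theta1 * X[i] - Y[i]
        theta0 -= 0.01 * err
        theta1 -= 0.01 * err * X[i]

print(theta0, theta1)  # near the true parameters (2.0, 3.0)
```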



杰哥 2012-10-19 13:33 Post a comment
]]>
[zz] The Newton–Raphson algorithmhttp://www.shnenglu.com/guijie/archive/2012/10/16/193347.html杰哥杰哥Mon, 15 Oct 2012 23:21:00 GMThttp://www.shnenglu.com/guijie/archive/2012/10/16/193347.htmlhttp://www.shnenglu.com/guijie/comments/193347.htmlhttp://www.shnenglu.com/guijie/archive/2012/10/16/193347.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/193347.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/193347.htmlhttp://blog.csdn.net/flyingworm_eley/article/details/6517853 

The Newton–Raphson algorithm is widely used in statistics to solve for parameter estimates in maximum likelihood estimation (MLE).</p>

The corresponding single-variable update is shown in the figure below:

 

The algorithm for multivariate functions:</p>

 

 

 

Example (implemented in R):</p>

# define the function f(x)

f=function(x){
    1/x+1/(1-x)
}

# define f_d1 as the first derivative of f

f_d1=function(x){
    -1/x^2+1/(x-1)^2
}

# define f_d2 as the second derivative of f

f_d2=function(x){
    2/x^3-2/(x-1)^3
}

 

# the NR algorithm
NR=function(time,init){
    X=NULL
    D1=NULL   # stores the first-derivative value at X[i]
    D2=NULL   # stores the second-derivative value at X[i]
    count=0

    X[1]=init
    l=seq(0.02,0.98,0.0002)
    plot(l,f(l),pch='.')
    points(X[1],f(X[1]),pch=2,col=1)

 

    for (i in 2:time){
        D1[i-1]=f_d1(X[i-1])
        D2[i-1]=f_d2(X[i-1])
        X[i]=X[i-1]-1/(D2[i-1])*(D1[i-1])   # NR iteration step
        if (abs(D1[i-1])<0.05) break
        points(X[i],f(X[i]),pch=2,col=i)
        count=count+1
    }
    return(list(x=X,derivative_1=D1,derivative_2=D2,count=count))
}


o=NR(30,0.9)
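The same Newton–Raphson iteration can be sketched in Python (my own port of the R code above, for f(x) = 1/x + 1/(1-x)): the update is x_{i+1} = x_i - f'(x_i)/f''(x_i), stopping when |f'(x_i)| is small.

```python
# Newton-Raphson search for a stationary point of f(x) = 1/x + 1/(1-x),
# mirroring the R code above (same derivatives, same kind of stopping rule).
def f_d1(x):
    return -1 / x**2 + 1 / (1 - x)**2   # first derivative of f

def f_d2(x):
    return 2 / x**3 + 2 / (1 - x)**3    # second derivative of f

def newton_raphson(x, steps=30, tol=1e-8):
    for _ in range(steps):
        if abs(f_d1(x)) < tol:
            break
        x = x - f_d1(x) / f_d2(x)       # NR iteration step
    return x

print(newton_raphson(0.9))  # converges to the stationary point x = 0.5
```

Note that 2/x^3 + 2/(1-x)^3 equals the R code's 2/x^3 - 2/(x-1)^3, since (x-1)^3 = -(1-x)^3.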

The results are shown in the figure below: triangles of different colors mark the estimate X[i] produced by the i-th iteration.

 

 


 

# take another function f(x)

f=function(x){
    return(exp(3.5*cos(x))+4*sin(x))
}

 

f_d1=function(x){
    return(-3.5*exp(3.5*cos(x))*sin(x)+4*cos(x))
}

 

f_d2=function(x){
    return(-4*sin(x)+3.5^2*exp(3.5*cos(x))*(sin(x))^2-3.5*exp(3.5*cos(x))*cos(x))
}

 

The results obtained are as follows:</p>

Reference from:

Kevin Quinn

Assistant Professor

Univ Washington



杰哥 2012-10-16 07:21 Post a comment
]]>