<?xml version="1.0" encoding="utf-8" standalone="yes"?>
杰哥's blog, http://www.shnenglu.com/guijie/ (last updated Tue, 09 Apr 2019 13:20:33 GMT)

How to solve AX + XB = C for X using MATLAB?
X = sylvester(A,B,C)
http://cn.mathworks.com/help/matlab/ref/sylvester.html
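For readers outside MATLAB, the same equation can be solved by hand via the Kronecker-product identity vec(AX + XB) = (I ⊗ A + Bᵀ ⊗ I) vec(X). Below is a small NumPy sketch; the matrices are made-up examples:

```python
import numpy as np

def solve_sylvester_kron(A, B, C):
    """Solve A X + X B = C using vec(A X + X B) = (I kron A + B^T kron I) vec(X),
    with column-major (Fortran-order) vectorization."""
    m, n = A.shape[0], B.shape[0]
    K = np.kron(np.eye(n), A) + np.kron(B.T, np.eye(m))
    x = np.linalg.solve(K, C.flatten(order='F'))
    return x.reshape((m, n), order='F')

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))          # example data, not from the post
B = rng.standard_normal((2, 2))
X_true = rng.standard_normal((3, 2))
C = A @ X_true + X_true @ B
X = solve_sylvester_kron(A, B, C)
print(np.allclose(A @ X + X @ B, C))     # residual check
```

SciPy users can instead call scipy.linalg.solve_sylvester, the direct analogue of MATLAB's sylvester; the Kronecker approach above is only practical for small matrices, since K is mn-by-mn.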

杰哥 2015-07-06 15:28
Alternating optimization

My personal understanding: these concepts are all equivalent.

‘alternating optimization’ or ‘alternative optimization’?

Sue (UTS) comment: ‘Alternating’ means you use this optimization with another optimization, one after the other. ‘Alternative’ means you use this optimization instead of any other.

My GSM-PAF paper ended up using 'alternating optimization'.



杰哥 2015-05-24 12:58
Fully mastering maximum likelihood estimation

This topic belongs to parameter estimation in probability theory and mathematical statistics; see Chapter 7, p. 168 of the textbook, and Section 3.11.1 of the pattern recognition notes (the material from Section 3.11 through Section 3.11.1 should be read together).

Summary: maximum likelihood estimation first assumes that the observed samples follow some distribution; the goal is to estimate the parameters of that distribution. The method: the model's parameter values are the ones under which the probability of obtaining this set of samples is largest. Write down the likelihood function, take its logarithm (the log-likelihood), then take the average of the log-likelihood (the mean log-likelihood), then differentiate it and set the derivative to zero to obtain the parameter values. As I currently understand it, the reason for taking logarithms is that probabilities are usually small numbers, and multiplying many of them together gives an extremely small value that easily causes floating-point underflow on a computer, so the logarithm is used instead.

Zhengxia has also mentioned that the likelihood is just a probability: the probability of what was observed.
https://en.wikipedia.org/wiki/Likelihood_function

杰哥 2013-12-05 19:21

How to use MATLAB to solve a quadratic optimization problem?

杰哥 2012-11-21 18:31
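This post has only a title. For completeness: in MATLAB the standard tool for quadratic programs, minimize ½xᵀHx + fᵀx subject to linear constraints, is quadprog. In the unconstrained convex case the minimizer has a closed form, since the gradient Hx + f must vanish. A minimal sketch; H and f are made-up examples:

```python
import numpy as np

# Unconstrained convex QP: minimize 0.5 * x'Hx + f'x.
# The gradient is Hx + f, so the minimizer solves the linear system Hx = -f.
H = np.array([[2.0, 0.5],
              [0.5, 1.0]])                 # made-up symmetric positive definite H
f = np.array([-1.0, 1.0])                  # made-up linear term
x_star = np.linalg.solve(H, -f)
print(np.allclose(H @ x_star + f, 0.0))    # gradient vanishes at the minimizer
```

With inequality constraints the closed form no longer applies and an iterative QP solver (e.g. quadprog in MATLAB) is needed.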
Taylor series in several variables
http://en.wikipedia.org/wiki/Taylor_series

Taylor series in several variables

The Taylor series may also be generalized to functions of more than one variable with

T(x_1,\dots,x_d) = \sum_{n_1=0}^\infty \sum_{n_2=0}^\infty \cdots \sum_{n_d = 0}^\infty  \frac{(x_1-a_1)^{n_1}\cdots (x_d-a_d)^{n_d}}{n_1!\cdots n_d!}\,\left(\frac{\partial^{n_1 + \cdots + n_d}f}{\partial x_1^{n_1}\cdots \partial x_d^{n_d}}\right)(a_1,\dots,a_d).\!

For example, for a function that depends on two variables, x and y, the Taylor series to second order about the point (a, b) is:

 \begin{align} f(x,y) & \approx f(a,b) +(x-a)\, f_x(a,b) +(y-b)\, f_y(a,b) \\ & {}\quad + \frac{1}{2!}\left[ (x-a)^2\,f_{xx}(a,b) + 2(x-a)(y-b)\,f_{xy}(a,b) +(y-b)^2\, f_{yy}(a,b) \right], \end{align}

where the subscripts denote the respective partial derivatives.

A second-order Taylor series expansion of a scalar-valued function of more than one variable can be written compactly as

T(\mathbf{x}) = f(\mathbf{a}) + \mathrm{D} f(\mathbf{a})^T (\mathbf{x} - \mathbf{a})  + \frac{1}{2!} (\mathbf{x} - \mathbf{a})^T \,\{\mathrm{D}^2 f(\mathbf{a})\}\,(\mathbf{x} - \mathbf{a}) + \cdots\! \,,

where \mathrm{D}f(\mathbf{a}) is the gradient of f evaluated at \mathbf{x} = \mathbf{a} and \mathrm{D}^2 f(\mathbf{a}) is the Hessian matrix. Applying the multi-index notation, the Taylor series for several variables becomes

T(\mathbf{x}) = \sum_{|\alpha| \ge 0}^{}\frac{(\mathbf{x}-\mathbf{a})^{\alpha}}{\alpha !}\,({\mathrm{\partial}^{\alpha}}\,f)(\mathbf{a})\,,

which is to be understood as a still more abbreviated multi-index version of the first equation of this paragraph, again in full analogy to the single variable case.
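As a numerical illustration of the compact second-order form above, the sketch below compares f(a + d) with f(a) + Df(a)ᵀd + ½dᵀD²f(a)d. The function, expansion point, and displacement are made-up examples:

```python
import numpy as np

# Hypothetical example: f(x, y) = x^2 * y + sin(y), with exact gradient and Hessian
f = lambda x: x[0]**2 * x[1] + np.sin(x[1])
grad = lambda x: np.array([2*x[0]*x[1], x[0]**2 + np.cos(x[1])])
hess = lambda x: np.array([[2*x[1], 2*x[0]],
                           [2*x[0], -np.sin(x[1])]])

a = np.array([1.0, 2.0])                          # expansion point (made up)
d = np.array([1e-3, -2e-3])                       # small displacement x - a
T2 = f(a) + grad(a) @ d + 0.5 * d @ hess(a) @ d   # second-order Taylor value
err2 = abs(f(a + d) - T2)                         # remainder is O(||d||^3)
err1 = abs(f(a + d) - (f(a) + grad(a) @ d))       # first-order remainder, O(||d||^2)
print(err2 < err1)
```

As expected, the second-order remainder is several orders of magnitude smaller than the first-order one for this small displacement.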

Example

Second-order Taylor series approximation (in gray) of the function f(x,y) = e^x\log(1+y) around the origin.

Compute a second-order Taylor series expansion around point (a,b) = (0,0) of a function

f(x,y)=e^x\log(1+y).\,

First, we compute all the partial derivatives we need:

f_x(a,b)=e^x\log(1+y)\bigg|_{(x,y)=(0,0)}=0\,,
f_y(a,b)=\frac{e^x}{1+y}\bigg|_{(x,y)=(0,0)}=1\,,
f_{xx}(a,b)=e^x\log(1+y)\bigg|_{(x,y)=(0,0)}=0\,,
f_{yy}(a,b)=-\frac{e^x}{(1+y)^2}\bigg|_{(x,y)=(0,0)}=-1\,,
f_{xy}(a,b)=f_{yx}(a,b)=\frac{e^x}{1+y}\bigg|_{(x,y)=(0,0)}=1.

The Taylor series is

\begin{align} T(x,y) = f(a,b) & +(x-a)\, f_x(a,b) +(y-b)\, f_y(a,b) \\ &+\frac{1}{2!}\left[ (x-a)^2\,f_{xx}(a,b) + 2(x-a)(y-b)\,f_{xy}(a,b) +(y-b)^2\, f_{yy}(a,b) \right]+ \cdots\,,\end{align}

which in this case becomes

\begin{align}T(x,y) &= 0 + 0(x-0) + 1(y-0) + \frac{1}{2}\Big[ 0(x-0)^2 + 2(x-0)(y-0) + (-1)(y-0)^2 \Big] + \cdots \\ &= y + xy - \frac{y^2}{2} + \cdots. \end{align}

Since log(1 + y) is analytic in |y| < 1, we have

e^x\log(1+y)= y + xy - \frac{y^2}{2} + \cdots

for |y| < 1.
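The expansion just derived can be checked numerically: near the origin, y + xy − y²/2 should match e^x log(1+y) up to a third-order remainder.

```python
import math

f = lambda x, y: math.exp(x) * math.log(1 + y)
T = lambda x, y: y + x*y - y**2 / 2               # the second-order expansion above

x, y = 0.01, -0.02                                # a point near the origin
remainder = abs(f(x, y) - T(x, y))                # third-order remainder
print(remainder < 1e-5)
```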



杰哥 2012-10-31 10:48
Jensen's inequality

If λ1 and λ2 are two arbitrary nonnegative real numbers such that λ1 + λ2 = 1 then convexity of \scriptstyle\varphi implies

\varphi(\lambda_1 x_1+\lambda_2 x_2)\leq \lambda_1\,\varphi(x_1)+\lambda_2\,\varphi(x_2)\text{ for any }x_1,\,x_2.  [This is exactly the definition of a convex function.]

This can be easily generalized: if λ1, λ2, ..., λn are nonnegative real numbers such that λ1 + ... + λn = 1, then

\varphi(\lambda_1 x_1+\lambda_2 x_2+\cdots+\lambda_n x_n)\leq \lambda_1\,\varphi(x_1)+\lambda_2\,\varphi(x_2)+\cdots+\lambda_n\,\varphi(x_n),

For example, -log(x) is a convex function.
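A quick numerical check of the inequality for the convex function φ(x) = −log(x), with random points and random weights summing to one:

```python
import math
import random

random.seed(0)
phi = lambda x: -math.log(x)                  # -log is convex on (0, inf)

x = [random.uniform(0.1, 5.0) for _ in range(5)]
w = [random.random() for _ in range(5)]
w = [wi / sum(w) for wi in w]                 # normalize weights to sum to 1

lhs = phi(sum(wi * xi for wi, xi in zip(w, x)))
rhs = sum(wi * phi(xi) for wi, xi in zip(w, x))
print(lhs <= rhs)                             # Jensen: phi(sum w_i x_i) <= sum w_i phi(x_i)
```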


杰哥 2012-10-30 12:04
Gradient Descent (梯度下降法): the two examples below come from two strong papers, both of which use this method to optimize their objective functions.
http://en.wikipedia.org/wiki/Gradient_descent
http://zh.wikipedia.org/wiki/%E6%9C%80%E9%80%9F%E4%B8%8B%E9%99%8D%E6%B3%95

Gradient descent is based on the observation that if the multivariable function F(\mathbf{x}) is defined and differentiable in a neighborhood of a point \mathbf{a}, then F(\mathbf{x}) decreases fastest if one goes from \mathbf{a} in the direction of the negative gradient of F at \mathbf{a}, namely -\nabla F(\mathbf{a}).

Why does the step size need to change? Tianyi's explanation is good: if the step size is too large, the function value may go up, so the step size should then be reduced (the figure below was drawn on paper and then scanned). The explanation in "Gradient descent intuition" (Part II, Linear Regression with One Variable, of Andrew Ng's Coursera course Machine Learning) is also very good: for a point on the right-hand side of the figure below, the gradient is positive, so -\nabla F(\mathbf{a}) is negative, which makes the current a decrease.

Example 1: Fig. 1 (the normalized graph Laplacian learning algorithm) of "Toward the Optimization of Normalized Graph Laplacian" (TNN 2011) is a very good example of gradient descent; only Fig. 1 needs to be read. Compare it with Shuning's lecture notes on nonlinear optimization (the fourth slide on page 8, corresponding to p. 124 of the textbook): the key is the line-search strategy. Applying the fourth slide on page 4, the step size is doubled or halved: whenever the objective decreases, move to the next search point and double the step size; otherwise stay at the current point and halve the step size.

Example 2: In "Distance Metric Learning for Large Margin Nearest Neighbor Classification" (JMLR), the objective function is Eq. 14, a quadratic form in the matrix M; expanding it shows that it is linear in M and hence convex. In the result of differentiating with respect to M, the formula between Eqs. 18 and 19 in the appendix, M no longer appears.

My own further thought: for a convex function, why not simply set the partial derivatives to zero and solve for the variables directly; why use gradient descent at all? Example 2 above shows why this can fail: after differentiating with respect to M, the result no longer involves M. As discussed with Tianyi: gradient descent is used precisely when setting the derivative to zero yields no closed-form solution; when a closed-form solution exists, you are done.
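The double/halve step-size rule from Example 1 can be sketched as follows. This is a minimal illustration on a one-dimensional quadratic, not the algorithm from either paper:

```python
# Double/halve line-search on a toy objective f(x) = (x - 3)^2 (made-up example)
def gd_double_halve(f, grad, x, step=1.0, iters=100):
    """If the trial point decreases the objective, accept it and double the
    step size; otherwise stay put and halve the step size."""
    for _ in range(iters):
        trial = x - step * grad(x)
        if f(trial) < f(x):
            x = trial
            step *= 2.0
        else:
            step /= 2.0
    return x

f = lambda x: (x - 3.0) ** 2
grad = lambda x: 2.0 * (x - 3.0)
x_min = gd_double_halve(f, grad, x=10.0)
print(abs(x_min - 3.0) < 1e-6)
```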

http://blog.csdn.net/yudingjun0611/article/details/8147046

1. Gradient descent

For the principle behind gradient descent, see the first lecture of the Stanford machine learning course.

The data used in my experiment are 100 two-dimensional points.

If the gradient descent algorithm does not behave properly, consider using a smaller step size (i.e., learning rate). Two points to note:

1) For a sufficiently small step size, the objective is guaranteed to decrease at every step;
2) but if the step size is too small, gradient descent converges very slowly.

Summary:
1) If the step size is too small, convergence is slow;
2) if it is too large, each iteration is not guaranteed to decrease the objective, so convergence is not guaranteed either.
How to choose it, an empirical rule:
..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, ...
each value roughly 3 times the previous one.

MATLAB source:

function [theta0,theta1] = Gradient_descent(X,Y)
theta0 = 0;
theta1 = 0;
t0 = 0;
t1 = 0;
while (1)
    for i = 1:1:100                 % 100 data points
        t0 = t0 + (theta0 + theta1*X(i,1) - Y(i,1)) * 1;
        t1 = t1 + (theta0 + theta1*X(i,1) - Y(i,1)) * X(i,1);
    end
    old_theta0 = theta0;
    old_theta1 = theta1;
    theta0 = theta0 - 0.000001*t0   % 0.000001 is the learning rate
    theta1 = theta1 - 0.000001*t1
    t0 = 0;
    t1 = 0;
    if (sqrt((old_theta0-theta0)^2 + (old_theta1-theta1)^2) < 0.000001)
        % convergence criterion; other tests are of course possible
        break;
    end
end


2. Stochastic gradient descent

Stochastic gradient descent suits the case where the number of samples is very large; overall, the algorithm still makes the iterates move in a direction of rapid descent of the objective.

MATLAB source:

function [theta0,theta1] = Gradient_descent_rand(X,Y)
theta0 = 0;
theta1 = 0;
t0 = theta0;
t1 = theta1;
for i = 1:1:100
    t0 = theta0 - 0.01*(theta0 + theta1*X(i,1) - Y(i,1)) * 1
    t1 = theta1 - 0.01*(theta0 + theta1*X(i,1) - Y(i,1)) * X(i,1)
    theta0 = t0
    theta1 = t1
end
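A rough Python analogue of the two MATLAB routines above, fitting the same kind of linear model. The synthetic data and learning rates are assumptions, since the original 100 points are not given:

```python
import random

random.seed(1)
# Synthetic stand-in for the post's 100 2-D points: y = 2 + 3x plus small noise
X = [random.uniform(0, 1) for _ in range(100)]
Y = [2 + 3 * x + random.gauss(0, 0.01) for x in X]

# 1) Batch gradient descent: accumulate the full gradient, then update
theta0, theta1 = 0.0, 0.0
alpha = 0.5                                    # learning rate (assumed)
for _ in range(2000):
    t0 = sum(theta0 + theta1 * x - y for x, y in zip(X, Y)) / 100
    t1 = sum((theta0 + theta1 * x - y) * x for x, y in zip(X, Y)) / 100
    new0, new1 = theta0 - alpha * t0, theta1 - alpha * t1
    converged = (new0 - theta0) ** 2 + (new1 - theta1) ** 2 < 1e-18
    theta0, theta1 = new0, new1
    if converged:                              # step size below tolerance
        break

# 2) Stochastic gradient descent: update after every single sample
s0, s1 = 0.0, 0.0
for x, y in zip(X, Y):
    err = s0 + s1 * x - y
    s0, s1 = s0 - 0.1 * err, s1 - 0.1 * err * x

print(round(theta0, 2), round(theta1, 2))
```

The batch version recovers the generating parameters (about 2 and 3); the single-pass stochastic version only gets into their neighborhood, which matches the post's remark that SGD trades per-step accuracy for scalability.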



杰哥 2012-10-19 13:33
[zz] The Newton-Raphson algorithm
http://blog.csdn.net/flyingworm_eley/article/details/6517853

The Newton-Raphson algorithm is widely used in statistics to compute maximum likelihood parameter estimates.

The corresponding univariate iteration (finding a zero of f', consistent with the code below) is

x_{i+1} = x_i - f'(x_i) / f''(x_i).

For a multivariate function, the derivative and second derivative are replaced by the gradient and the Hessian:

\mathbf{x}_{i+1} = \mathbf{x}_i - [\nabla^2 f(\mathbf{x}_i)]^{-1} \nabla f(\mathbf{x}_i).

Example (implemented in R):

# define the objective f(x)
f = function(x) {
    1/x + 1/(1-x)
}

# f_d1: first derivative of f
f_d1 = function(x) {
    -1/x^2 + 1/(x-1)^2
}

# f_d2: second derivative of f
f_d2 = function(x) {
    2/x^3 - 2/(x-1)^3
}

# the Newton-Raphson iteration
NR = function(time, init) {
    X = NULL
    D1 = NULL    # stores the first-derivative value at each iterate X[i]
    D2 = NULL    # stores the second-derivative value at each iterate X[i]
    count = 0

    X[1] = init
    l = seq(0.02, 0.98, 0.0002)
    plot(l, f(l), pch='.')
    points(X[1], f(X[1]), pch=2, col=1)

    for (i in 2:time) {
        D1[i-1] = f_d1(X[i-1])
        D2[i-1] = f_d2(X[i-1])
        X[i] = X[i-1] - 1/(D2[i-1]) * (D1[i-1])    # Newton-Raphson update
        if (abs(D1[i-1]) < 0.05) break
        points(X[i], f(X[i]), pch=2, col=i)
        count = count + 1
    }
    return(list(x=X, derivative1=D1, derivative2=D2, count=count))
}

o = NR(30, 0.9)
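The same iteration, written in Python for the first objective f(x) = 1/x + 1/(1-x); Newton-Raphson on f' converges to the minimizer x = 0.5:

```python
# Objective from the R example: f(x) = 1/x + 1/(1-x) on (0, 1)
f_d1 = lambda x: -1 / x**2 + 1 / (x - 1)**2   # f'(x)
f_d2 = lambda x: 2 / x**3 - 2 / (x - 1)**3    # f''(x)

x = 0.9                                        # same starting point as NR(30, 0.9)
for _ in range(50):
    d1 = f_d1(x)
    if abs(d1) < 1e-10:                        # stop when the derivative vanishes
        break
    x = x - d1 / f_d2(x)                       # Newton-Raphson update

print(abs(x - 0.5) < 1e-6)                     # the minimizer of f on (0, 1) is 0.5
```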

The result is shown in the figure below: triangles of different colors mark the estimate X_i produced at iteration i.

 

 


 

# a second objective function f(x)

f=function(x){
    return(exp(3.5*cos(x))+4*sin(x))
}

 

f_d1=function(x){
    return(-3.5*exp(3.5*cos(x))*sin(x)+4*cos(x))
}

 

f_d2=function(x){
    return(-4*sin(x)+3.5^2*exp(3.5*cos(x))*(sin(x))^2-3.5*exp(3.5*cos(x))*cos(x))
}

 

The results are shown in the figure below.

Reference: Kevin Quinn, Assistant Professor, University of Washington.



杰哥 2012-10-16 07:21