杰哥好,哈哈! (http://www.shnenglu.com/guijie/category/20090.html)

How to solve AX + XB = C for X using MATLAB?
http://www.shnenglu.com/guijie/archive/2015/07/06/211161.html
(2015-07-06)

X = sylvester(A,B,C)
http://cn.mathworks.com/help/matlab/ref/sylvester.html
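A minimal usage sketch (my own addition; sylvester solves A*X + X*B = C and ships with recent MATLAB releases, and A, B, C below are random placeholders, not from the original post):

    % Solve A*X + X*B = C directly, then check the residual.
    A = rand(3);                 % 3x3
    B = rand(4);                 % 4x4
    C = rand(3,4);               % 3x4, so X comes back 3x4
    X = sylvester(A, B, C);
    norm(A*X + X*B - C, 'fro')   % should be near machine precision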

杰哥 2015-07-06 15:28
Alternating optimization
http://www.shnenglu.com/guijie/archive/2015/05/24/210729.html
(2015-05-24)

My personal understanding: these terms are all equivalent.

‘alternating optimization’ or ‘alternative optimization’?

Sue's (UTS) comment: ‘Alternating’ means you use this optimization with another optimization, one after the other. ‘Alternative’ means you use this optimization instead of any other.

My GSM-PAF paper ended up using ‘alternating optimization’.



杰哥 2015-05-24 12:58
Fully mastering maximum likelihood estimation
http://www.shnenglu.com/guijie/archive/2013/12/05/204609.html
(2013-12-05)

This belongs to parameter estimation in probability theory and mathematical statistics; see Chapter 7, p. 168 of the textbook, and Section 3.11.1 of the pattern recognition notes (everything from Section 3.11 through Section 3.11.1 should be read).
Summary: the maximum likelihood method first assumes that the observed samples follow some distribution; the goal is to estimate the parameters of that distribution. The idea is that the parameter values under which this set of samples has the largest probability are the estimates for the model. Write down the likelihood function, take its logarithm (the log-likelihood), average it (the average log-likelihood), then differentiate and solve for the parameter values. As I currently understand it, the logarithm is needed because probabilities are usually small numbers and their product becomes extremely small; on a computer this easily causes floating-point underflow, hence taking the logarithm.
Zhengxia also mentioned that the likelihood is just a probability: the probability of what was observed.
https://en.wikipedia.org/wiki/Likelihood_function
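As a worked illustration of this recipe (my own addition, not from the original post): for i.i.d. samples x_1, ..., x_n from a Gaussian with known sigma,

$$
L(\mu) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{(x_i-\mu)^2}{2\sigma^2}\right),
\qquad
\ell(\mu) = \log L(\mu) = -n\log\!\big(\sqrt{2\pi}\,\sigma\big) - \sum_{i=1}^{n}\frac{(x_i-\mu)^2}{2\sigma^2}.
$$

Setting $\ell'(\mu) = \sum_i (x_i-\mu)/\sigma^2 = 0$ gives $\hat\mu = \frac{1}{n}\sum_i x_i$, the sample mean; the logarithm turns the product into a sum without moving the maximizer.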
杰哥 2013-12-05 19:21

How to use MATLAB to solve a quadratic optimization problem?
http://www.shnenglu.com/guijie/archive/2012/11/21/195475.html
(2012-11-21)
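The archived body of this post is empty. For what it's worth, MATLAB's Optimization Toolbox solves quadratic programs of the form min 0.5*x'*H*x + f'*x subject to A*x <= b with quadprog; a minimal sketch with placeholder data (my own, not from the post):

    % Minimize 0.5*x'*H*x + f'*x  subject to  A*x <= b.
    H = [2 0; 0 2];          % positive definite, so the QP is convex
    f = [-2; -5];
    A = [1 1];  b = 2;       % one linear inequality constraint
    x = quadprog(H, f, A, b)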
杰哥 2012-11-21 18:31
Taylor series in several variables
http://www.shnenglu.com/guijie/archive/2012/10/31/194113.html
(2012-10-31)
Source: http://en.wikipedia.org/wiki/Taylor_series


The Taylor series may also be generalized to functions of more than one variable with

$$
T(x_1,\dots,x_d) = \sum_{n_1=0}^\infty \sum_{n_2=0}^\infty \cdots \sum_{n_d=0}^\infty \frac{(x_1-a_1)^{n_1}\cdots (x_d-a_d)^{n_d}}{n_1!\cdots n_d!}\,\left(\frac{\partial^{n_1+\cdots+n_d} f}{\partial x_1^{n_1}\cdots \partial x_d^{n_d}}\right)(a_1,\dots,a_d).
$$

For example, for a function that depends on two variables, $x$ and $y$, the Taylor series to second order about the point $(a, b)$ is:

$$
\begin{align}
f(x,y) & \approx f(a,b) + (x-a)\,f_x(a,b) + (y-b)\,f_y(a,b) \\
& \quad + \frac{1}{2!}\left[(x-a)^2\,f_{xx}(a,b) + 2(x-a)(y-b)\,f_{xy}(a,b) + (y-b)^2\,f_{yy}(a,b)\right],
\end{align}
$$

where the subscripts denote the respective partial derivatives.

A second-order Taylor series expansion of a scalar-valued function of more than one variable can be written compactly as

$$
T(\mathbf{x}) = f(\mathbf{a}) + \mathrm{D}f(\mathbf{a})^T (\mathbf{x}-\mathbf{a}) + \frac{1}{2!}\,(\mathbf{x}-\mathbf{a})^T \{\mathrm{D}^2 f(\mathbf{a})\}\,(\mathbf{x}-\mathbf{a}) + \cdots,
$$

where $\mathrm{D}f(\mathbf{a})$ is the gradient of $f$ evaluated at $\mathbf{x} = \mathbf{a}$ and $\mathrm{D}^2 f(\mathbf{a})$ is the Hessian matrix. Applying the multi-index notation, the Taylor series for several variables becomes

$$
T(\mathbf{x}) = \sum_{|\alpha| \ge 0} \frac{(\mathbf{x}-\mathbf{a})^{\alpha}}{\alpha!}\,(\partial^{\alpha} f)(\mathbf{a}),
$$

which is to be understood as a still more abbreviated multi-index version of the first equation of this paragraph, again in full analogy to the single variable case.

Example

[Figure: second-order Taylor series approximation (in gray) of $f(x,y) = e^x\log(1+y)$ around the origin.]

Compute a second-order Taylor series expansion around the point $(a,b) = (0,0)$ of the function

$$f(x,y) = e^x \log(1+y).$$

First, we compute all the partial derivatives we need:

$$
\begin{align}
f_x(a,b) &= e^x\log(1+y)\bigg|_{(x,y)=(0,0)} = 0\,, \\
f_y(a,b) &= \frac{e^x}{1+y}\bigg|_{(x,y)=(0,0)} = 1\,, \\
f_{xx}(a,b) &= e^x\log(1+y)\bigg|_{(x,y)=(0,0)} = 0\,, \\
f_{yy}(a,b) &= -\frac{e^x}{(1+y)^2}\bigg|_{(x,y)=(0,0)} = -1\,, \\
f_{xy}(a,b) &= f_{yx}(a,b) = \frac{e^x}{1+y}\bigg|_{(x,y)=(0,0)} = 1.
\end{align}
$$

The Taylor series is

$$
\begin{align}
T(x,y) = f(a,b) & + (x-a)\,f_x(a,b) + (y-b)\,f_y(a,b) \\
& + \frac{1}{2!}\left[(x-a)^2\,f_{xx}(a,b) + 2(x-a)(y-b)\,f_{xy}(a,b) + (y-b)^2\,f_{yy}(a,b)\right] + \cdots,
\end{align}
$$

which in this case becomes

$$
\begin{align}
T(x,y) &= 0 + 0(x-0) + 1(y-0) + \frac{1}{2}\Big[0(x-0)^2 + 2(x-0)(y-0) + (-1)(y-0)^2\Big] + \cdots \\
&= y + xy - \frac{y^2}{2} + \cdots.
\end{align}
$$

Since $\log(1+y)$ is analytic in $|y| < 1$, we have

$$e^x\log(1+y) = y + xy - \frac{y^2}{2} + \cdots$$

for $|y| < 1$.
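The expansion is easy to sanity-check numerically; a minimal MATLAB sketch (my own addition):

    % Compare f(x,y) = exp(x)*log(1+y) with its second-order Taylor
    % polynomial y + x*y - y^2/2 near the origin.
    f  = @(x,y) exp(x).*log(1+y);
    T2 = @(x,y) y + x.*y - y.^2/2;
    abs(f(0.05, 0.1) - T2(0.05, 0.1))   % small; shrinks cubically toward (0,0)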



杰哥 2012-10-31 10:48
Jensen's inequality
http://www.shnenglu.com/guijie/archive/2012/10/30/194080.html
(2012-10-30)

If $\lambda_1$ and $\lambda_2$ are two arbitrary nonnegative real numbers such that $\lambda_1 + \lambda_2 = 1$, then convexity of $\varphi$ implies

$$\varphi(\lambda_1 x_1+\lambda_2 x_2) \leq \lambda_1\,\varphi(x_1)+\lambda_2\,\varphi(x_2) \quad\text{for any } x_1,\,x_2.$$ [This is exactly the definition of a convex function.]

This can be easily generalized: if $\lambda_1, \lambda_2, \ldots, \lambda_n$ are nonnegative real numbers such that $\lambda_1 + \cdots + \lambda_n = 1$, then

$$\varphi(\lambda_1 x_1+\lambda_2 x_2+\cdots+\lambda_n x_n) \leq \lambda_1\,\varphi(x_1)+\lambda_2\,\varphi(x_2)+\cdots+\lambda_n\,\varphi(x_n),$$

渚嬪-log(x)鏄嚫鍑芥暟


杰哥 2012-10-30 12:04
Gradient Descent(姊害涓嬮檷娉?(涓や緥瀵瑰簲涓ょ墰鏂囧潎鐢ㄨ娉曟眰瑙g洰鏍囧嚱鏁?http://www.shnenglu.com/guijie/archive/2012/10/19/193522.html鏉板摜鏉板摜Fri, 19 Oct 2012 05:33:00 GMThttp://www.shnenglu.com/guijie/archive/2012/10/19/193522.htmlhttp://www.shnenglu.com/guijie/comments/193522.htmlhttp://www.shnenglu.com/guijie/archive/2012/10/19/193522.html#Feedback0http://www.shnenglu.com/guijie/comments/commentRss/193522.htmlhttp://www.shnenglu.com/guijie/services/trackbacks/193522.htmlhttp://en.wikipedia.org/wiki/Gradient_descent 
http://zh.wikipedia.org/wiki/%E6%9C%80%E9%80%9F%E4%B8%8B%E9%99%8D%E6%B3%95
Gradient descent is based on the observation that if the multivariable function $F(\mathbf{x})$ is defined and differentiable in a neighborhood of a point $\mathbf{a}$, then $F(\mathbf{x})$ decreases fastest if one goes from $\mathbf{a}$ in the direction of the negative gradient of $F$ at $\mathbf{a}$, that is, $-\nabla F(\mathbf{a})$.
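For reference, the iteration this observation suggests (as in the same Wikipedia article) is

$$\mathbf{a}_{n+1} = \mathbf{a}_n - \gamma_n\,\nabla F(\mathbf{a}_n),$$

where the step size $\gamma_n > 0$ is allowed to change from one iteration to the next; for small enough $\gamma_n$, $F(\mathbf{a}_{n+1}) \le F(\mathbf{a}_n)$.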
Why does the step size need to change? Tianyi's explanation is good: if the step size is too large, the function value may go up, so the step size should be reduced (the picture below was drawn on paper and then scanned; it is missing from the archive). The explanation in "Gradient Descent Intuition" of "II. Linear Regression with One Variable" in Andrew Ng's Coursera course Machine Learning is also very good: for example, at a point on the right side of the figure, the gradient is positive, so $-\nabla F(\mathbf{a})$ is negative, which makes the current $a$ decrease.
Example 1: Fig. 1 ("Normalized graph Laplacian learning algorithm") in "Toward the Optimization of Normalized Graph Laplacian" (TNN 2011) is a very good example of gradient descent. Only Fig. 1 needs to be read; the rest can be skipped. Fig. 1 corresponds to slide 4 on page 8 of Shuning's nonlinear-optimization lecture notes (textbook p. 124); the key is the line-search strategy, applying slide 4 on page 4 of the same notes: double or halve the step size. Whenever the objective decreases, move to the next search point and double the step size; otherwise stay at the current point and halve the step size.
Example 2: In "Distance Metric Learning for Large Margin Nearest Neighbor Classification" (JMLR), the objective is Eq. 14, a quadratic form in the matrix M; expanding it shows the objective is linear in M, hence convex. After differentiating with respect to M (the formula between Eqs. 18 and 19 in the appendix), M no longer appears.

An extra thought of my own: if the function is convex, why not just set the partial derivative with respect to the variable to zero and solve for the variable directly; why is gradient descent still needed? Example 2 above shows why not: after differentiating with respect to M, the result no longer involves M, so the stationarity equation cannot be solved for M in closed form. Discussed this with Tianyi: gradient descent is used precisely when setting the derivative to zero has no analytic solution; when an analytic solution exists, you are done.
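To make the contrast concrete, a minimal MATLAB sketch (my own illustration with synthetic data): linear least squares is a convex problem with a closed-form solution, so no iteration is needed there.

    % Closed-form least squares: minimize ||theta(1) + theta(2)*x - y||^2.
    % The backslash operator solves the normal equations directly.
    X = rand(100,1);
    Y = 2 + 3*X + 0.01*randn(100,1);   % synthetic points around y = 2 + 3x
    theta = [ones(100,1) X] \ Y        % theta(1) ~ 2, theta(2) ~ 3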

http://blog.csdn.net/yudingjun0611/article/details/8147046

1. 姊害涓嬮檷娉?/strong>

姊害涓嬮檷娉曠殑鍘熺悊鍙互鍙傝冿細鏂潶紱忔満鍣ㄥ涔犵涓璁?/a>銆?/span>

The data used in my experiment are 100 two-dimensional points.

濡傛灉姊害涓嬮檷綆楁硶涓嶈兘姝e父榪愯錛岃冭檻浣跨敤鏇村皬鐨勬闀?涔熷氨鏄涔犵巼)錛岃繖閲岄渶瑕佹敞鎰忎袱鐐癸細

1錛夊浜庤凍澶熷皬鐨?  鑳戒繚璇佸湪姣忎竴姝ラ兘鍑忓皬錛?/span>
2錛変絾鏄鏋滃お灝忥紝姊害涓嬮檷綆楁硶鏀舵暃鐨勪細寰堟參錛?/span>

Summary:
1) If α is too small, convergence is very slow;
2) if α is too large, the objective is not guaranteed to decrease at every iteration, so convergence is not guaranteed either.
How to choose α (an empirical method):
..., 0.001, 0.003, 0.01, 0.03, 0.1, 0.3, 1, ...
each value roughly 3x the previous one.

MATLAB source:

function [theta0,theta1] = Gradient_descent(X,Y)
theta0 = 0;
theta1 = 0;
t0 = 0;
t1 = 0;
while (1)
    for i = 1:100                        % 100 sample points
        t0 = t0 + (theta0 + theta1*X(i,1) - Y(i,1))*1;
        t1 = t1 + (theta0 + theta1*X(i,1) - Y(i,1))*X(i,1);
    end
    old_theta0 = theta0;
    old_theta1 = theta1;
    theta0 = theta0 - 0.000001*t0        % 0.000001 is the learning rate
    theta1 = theta1 - 0.000001*t1
    t0 = 0;
    t1 = 0;
    % convergence test on the parameter change; other criteria are possible
    if (sqrt((old_theta0-theta0)^2 + (old_theta1-theta1)^2) < 0.000001)
        break;
    end
end
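A hypothetical call, reusing the synthetic X and Y from the least-squares sketch above:

    % Gradient descent should approach the same fit as the closed form,
    % although with this learning rate (1e-6) convergence is very slow.
    [theta0, theta1] = Gradient_descent(X, Y);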


2. Stochastic gradient descent

闅忔満姊害涓嬮檷娉曢傜敤浜庢牱鏈偣鏁伴噺闈炲父搴炲ぇ鐨勬儏鍐碉紝綆楁硶浣垮緱鎬諱綋鍚戠潃姊害涓嬮檷蹇殑鏂瑰悜涓嬮檷銆?/span>

MATLAB source:

function [theta0,theta1] = Gradient_descent_rand(X,Y)
theta0 = 0;
theta1 = 0;
t0 = theta0;
t1 = theta1;
for i = 1:100
    % each sample point immediately updates the parameters (learning rate 0.01)
    t0 = theta0 - 0.01*(theta0 + theta1*X(i,1) - Y(i,1))*1
    t1 = theta1 - 0.01*(theta0 + theta1*X(i,1) - Y(i,1))*X(i,1)
    theta0 = t0
    theta1 = t1
end



杰哥 2012-10-19 13:33
[zz] The Newton-Raphson algorithm
http://www.shnenglu.com/guijie/archive/2012/10/16/193347.html
(2012-10-16)
Source: http://blog.csdn.net/flyingworm_eley/article/details/6517853

The Newton-Raphson algorithm is widely used in statistics to compute maximum likelihood (MLE) parameter estimates.

The corresponding univariate case was shown in a figure that is missing from the archived post.
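The missing figure presumably showed the standard univariate update for solving $f'(\theta) = 0$, which is what the R code below implements:

$$\theta_{n+1} = \theta_n - \frac{f'(\theta_n)}{f''(\theta_n)}.$$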

The multivariate version of the algorithm (its figures are likewise missing from the archive):
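Presumably this is the analogous update with the gradient and the inverse Hessian in place of the first and second derivatives:

$$\boldsymbol{\theta}_{n+1} = \boldsymbol{\theta}_n - \left[\nabla^2 f(\boldsymbol{\theta}_n)\right]^{-1} \nabla f(\boldsymbol{\theta}_n).$$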


Example (implemented in R):

# define the function f(x)
f = function(x) {
    1/x + 1/(1-x)
}

# define f_d1, the first derivative of f
f_d1 = function(x) {
    -1/x^2 + 1/(x-1)^2
}

# define f_d2, the second derivative of f
f_d2 = function(x) {
    2/x^3 - 2/(x-1)^3
}

 

# the NR algorithm
NR = function(time, init) {
    X  = NULL
    D1 = NULL    # stores the first-derivative value at each X[i]
    D2 = NULL    # stores the second-derivative value at each X[i]
    count = 0

    X[1] = init
    l = seq(0.02, 0.98, 0.0002)
    plot(l, f(l), pch='.')
    points(X[1], f(X[1]), pch=2, col=1)

    for (i in 2:time) {
        D1[i-1] = f_d1(X[i-1])
        D2[i-1] = f_d2(X[i-1])
        X[i] = X[i-1] - 1/(D2[i-1]) * (D1[i-1])    # the NR iteration step
        if (abs(D1[i-1]) < 0.05) break
        points(X[i], f(X[i]), pch=2, col=i)
        count = count + 1
    }
    return(list(x=X, derivative_1=D1, derivative_2=D2, count=count))
}


o=NR(30,0.9)

The result is shown in the figure below (missing from the archived post): triangles of different colors mark the estimate X_i produced at iteration i.


# a second test function f(x)
f = function(x) {
    return(exp(3.5*cos(x)) + 4*sin(x))
}

# its first derivative
f_d1 = function(x) {
    return(-3.5*exp(3.5*cos(x))*sin(x) + 4*cos(x))
}

# its second derivative
f_d2 = function(x) {
    return(-4*sin(x) + 3.5^2*exp(3.5*cos(x))*(sin(x))^2 - 3.5*exp(3.5*cos(x))*cos(x))
}

 

The results are as follows (the result figure is missing from the archived post):

Reference: Kevin Quinn, Assistant Professor, University of Washington.



杰哥 2012-10-16 07:21