<?xml version="1.0" encoding="utf-8" standalone="yes"?>
http://m.tkk7.com/ytl-zlq/<font size="4">厚积而薄发 (build up deeply, release gradually) -- every day is a brand-new start</font>zh-cn Sun, 11 May 2025 10:22:57 GMT Sun, 11 May 2025 10:22:57 GMT 60
最大公约数 (Greatest common divisor) http://m.tkk7.com/ytl-zlq/archive/2013/03/21/396781.html ytl Thu, 21 Mar 2013 01:39:00 GMT http://m.tkk7.com/ytl-zlq/archive/2013/03/21/396781.html http://m.tkk7.com/ytl-zlq/comments/396781.html http://m.tkk7.com/ytl-zlq/archive/2013/03/21/396781.html#Feedback 0 http://m.tkk7.com/ytl-zlq/comments/commentRss/396781.html http://m.tkk7.com/ytl-zlq/services/trackbacks/396781.html Read the full article

Posted by ytl, 2013-03-21 09:39
]]>
The JVM memory model: the eden area (repost) http://m.tkk7.com/ytl-zlq/archive/2012/03/01/371093.html ytl Thu, 01 Mar 2012 10:12:00 GMT http://m.tkk7.com/ytl-zlq/archive/2012/03/01/371093.html http://m.tkk7.com/ytl-zlq/comments/371093.html http://m.tkk7.com/ytl-zlq/archive/2012/03/01/371093.html#Feedback 0 http://m.tkk7.com/ytl-zlq/comments/commentRss/371093.html http://m.tkk7.com/ytl-zlq/services/trackbacks/371093.html


A brief look at the Java memory model

       Memory models differ from platform to platform, but the JVM memory-model specification is uniform. In fact, Java's multithreading and concurrency problems are all ultimately reflected in the Java memory model: so-called thread safety is nothing more than controlling the ordered access to, or modification of, some resource by multiple threads. To summarize the Java memory model, two main problems have to be solved: visibility and ordering. We all know that computers have high-speed caches, so the processor does not fetch from main memory every time it processes data. The JVM defines its own memory model, shielding the memory-management details of the underlying platform; what a Java developer must be clear about is how, on top of the JVM memory model, to solve the visibility and ordering problems of multithreaded code.<br />
       So what is <strong>visibility</strong>? Threads cannot pass data to one another directly; they can communicate only through shared variables. The Java memory model (JMM) specifies that the JVM has a main memory, and main memory is shared by all threads. When an object is created with new, it too is allocated in main memory. Every thread has its own working memory, which stores copies of some of the objects in main memory (naturally, a thread's working memory is limited in size). When a thread operates on some object, it executes in the following order:<br /> (1) copy the variable from main memory into the current working memory (read and load)
 (2) execute the code, changing the value of the shared variable (use and assign)
 (3) flush the working-memory data back to the relevant locations in main memory (store and write)

The JVM specification defines the instructions by which a thread operates on main memory: read, load, use, assign, store, write. When a shared variable has copies in the working memory of multiple threads, if one thread modifies this shared variable, then the other threads should be able to see the modified value; this is the thread visibility problem.<br />
       Then what is <strong>ordering</strong>? A thread cannot reference a variable directly from main memory. If the thread's working memory does not contain the variable, a copy is fetched from main memory into working memory; this process is read-load, and once it completes the thread references the copy. When the same thread references the field again, it may re-fetch the copy from main memory (read-load-use), or it may directly reference the existing copy (use); that is, the read, load, use order may be decided by the JVM implementation.<br />
       A thread cannot assign directly to a field in main memory either: it assigns the value to the variable copy in its working memory (assign), and once that completes the copy is synchronized to the main storage area (store-write); when exactly it is synchronized over is decided by the JVM implementation. When the same thread assigns to the field repeatedly, for example:<br />
Java code:
for(int i=0;i<10;i++)   
 a++;  
the thread may assign only to the copy in its working memory and synchronize to the main storage area after the last assignment, so the assign, store, write order may likewise be decided by the JVM implementation. Suppose there is a shared variable x and thread a executes x = x + 1. From the description above we know that x = x + 1 is not an atomic operation; its execution process is:<br />
1. read a copy of variable x from main memory into working memory<br />
2. add 1 to x
3. write the value of x + 1 back to main memory<br />
If another thread b executes x = x - 1, the execution process is:
1. read a copy of variable x from main memory into working memory<br />
2. subtract 1 from x
3. write the value of x - 1 back to main memory
Then clearly the final value of x is unreliable. Suppose x is now 10; thread a adds 1 and thread b subtracts 1. On the surface it seems the final x should still be 10, but under multithreading this situation can occur:
1: thread a reads a copy of x from main memory into its working memory; x in its working memory is 10
2: thread b reads a copy of x from main memory into its working memory; x in its working memory is 10
3: thread a adds 1 to x in its working memory; x in its working memory is 11
4: thread a commits x to main memory; x in main memory is 11
5: thread b subtracts 1 from x in its working memory; x in its working memory is 9
6: thread b commits x to main memory; x in main memory is 9</p>
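The lost-update interleaving above can be reproduced deterministically by modeling each thread's working memory as a plain local copy. This is only an illustrative sketch of the six steps, not real JVM semantics (the `main_memory` dict and the fixed step order are assumptions for demonstration):

```python
# Simulate the interleaving: each "thread" keeps a local copy of x
# (its working memory) and writes the copy back to main memory later.
main_memory = {"x": 10}

# Steps 1-2: both threads read-load their own copies of x.
a_copy = main_memory["x"]   # thread a's working memory: 10
b_copy = main_memory["x"]   # thread b's working memory: 10

# Steps 3-4: thread a increments its copy and commits it.
a_copy += 1                 # 11
main_memory["x"] = a_copy   # main memory: 11

# Steps 5-6: thread b decrements its stale copy and commits it.
b_copy -= 1                 # 9
main_memory["x"] = b_copy   # main memory: 9 -- thread a's update is lost

print(main_memory["x"])     # 9, not the "expected" 10
```

In real Java code the same effect is prevented by making the read-modify-write atomic, e.g. with synchronized or an AtomicInteger.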

 

<strong>The JVM memory model: the eden area</strong>

What exactly is a thread's "working memory"? Some people take it to be the thread's stack, but that understanding is incorrect. Looking at how the JLS (the Java Language Specification) describes a thread's working memory, the thread's working memory is merely an abstraction over the CPU's <strong>registers and caches</strong>.</p>

      鍙兘 寰堝浜洪兘瑙夊緱鑾悕鍏跺錛岃JVM鐨勫唴瀛樻ā鍨嬶紝鎬庝箞浼氭壇鍒癱pu涓婂幓鍛紵鍦ㄦ錛屾垜璁や負寰堟湁蹇呰闃愯堪涓嬶紝鍏?寰楀緢澶氫漢鐪嬪緱涓嶆槑涓嶇櫧鐨勩傚厛鎶涘紑java铏氭嫙鏈轟笉璋堬紝鎴戜滑閮界煡閬擄紝鐜板湪鐨勮綆楁満錛宑pu鍦ㄨ綆楃殑鏃跺欙紝騫朵笉鎬繪槸浠庡唴瀛樿鍙栨暟鎹紝瀹冪殑鏁版嵁璇誨彇欏哄簭浼樺厛綰?鏄細瀵勫瓨鍣紞楂橀熺紦瀛橈紞鍐呭瓨銆傜嚎紼嬭楄垂鐨勬槸CPU錛岀嚎紼嬭綆楃殑鏃跺欙紝鍘熷鐨勬暟鎹潵鑷唴瀛橈紝鍦ㄨ綆楄繃紼嬩腑錛屾湁浜涙暟鎹彲鑳借棰戠箒璇誨彇錛岃繖浜涙暟鎹瀛樺偍鍦ㄥ瘎瀛樺櫒 鍜岄珮閫熺紦瀛樹腑錛屽綋綰跨▼璁$畻瀹屽悗錛岃繖浜涚紦瀛樼殑鏁版嵁鍦ㄩ傚綋鐨勬椂鍊欏簲璇ュ啓鍥炲唴瀛樸傚綋涓涓嚎紼嬪悓鏃惰鍐欐煇涓唴瀛樻暟鎹椂錛屽氨浼氫駭鐢熷綰跨▼騫跺彂闂錛屾秹鍙婂埌涓変釜鐗?鎬э細鍘熷瓙鎬э紝鏈夊簭鎬э紝鍙鎬с傚湪銆婄嚎紼嬪畨鍏ㄦ葷粨銆嬭繖綃囨枃绔犱腑錛屼負浜嗙悊瑙f柟渚匡紝鎴戞妸鍘熷瓙鎬у拰鏈夊簭鎬х粺涓鍙仛“澶氱嚎紼嬫墽琛屾湁搴忔?#8221;銆傛敮鎸佸綰跨▼鐨勫鉤鍙伴兘浼氶潰涓?榪欑闂錛岃繍琛屽湪澶氱嚎紼嬪鉤鍙頒笂鏀寔澶氱嚎紼嬬殑璇█搴旇鎻愪緵瑙e喅璇ラ棶棰樼殑鏂規銆?/p>

      <strong>synchronized, volatile, and lock mechanisms (such as synchronized blocks, ready queues, and blocking queues)</strong> and so on are such solutions. But these are only at the syntax level; we should understand them in essence rather than merely knowing that synchronized guarantees synchronization and leaving it at that. The JVM memory model I describe here is the dynamic one, oriented toward multithreaded concurrency. I follow the JLS wording of "working memory" only because I do not want to drag in too many low-level details: the point of 《线程安全总结》 is to explain how to understand Java thread synchronization at the syntax level and to know the usage scenario of each keyword.</p>

璇磋JVM鐨別den鍖哄惂銆侸VM鐨勫唴瀛橈紝琚垝鍒嗕簡寰堝鐨勫尯鍩燂細

1. Program counter<br />Each Java thread has a program counter, used to record which instruction of the current method execution has reached.<br />2. Thread stack (VM stack)<br />Each time a method is executed, a frame (Frame) is created to store the local variable table, operand stack, dynamic linking, method entry/exit information, and so on. Each method call, from invocation to completion, corresponds to one frame being pushed onto and popped off the VM stack. If the stack depth a thread requests exceeds what the virtual machine allows, a StackOverflowError is thrown; if the VM stack can be dynamically expanded (the VM Spec also permits fixed-length VM stacks) but cannot obtain enough memory while expanding, an OutOfMemoryError is thrown.<br />3. Native method stack<br />4. Heap<br />Each thread's stack is private to that thread, while the heap is shared by all threads. When we new an object, that object is allocated in the heap. But the heap is not a single flat region; it is subdivided into many areas, and the reason is the JVM's garbage collection. Current JVM GCs are all generational: the heap is roughly divided into three parts, the young generation, the old generation, and the permanent generation (virtual); the young generation is further divided into the eden area, s0, and s1. When a new object is created, small and short-lived objects are placed in the young generation's eden area. When eden fills up, a small-scope GC (minor gc) runs; when the whole young generation fills up, a larger-scope GC (major gc) runs, promoting part of the young generation's objects into the old generation.<br />5. Method area
This is in fact the permanent generation (Permanent Generation). The method area stores the structural information of each Class, including the constant pool, field descriptions, method descriptions, and so on. The VM Spec's constraints on this region are quite loose: besides not requiring contiguous memory, like the Java heap, it may be of fixed size or expandable, and an implementation may even choose not to implement garbage collection here. Relatively speaking, garbage collection happens rather rarely in this region, but it is not the case, as some descriptions have it, that the permanent generation never experiences GC (at least that holds for today's mainstream commercial JVM implementations). GC here mainly targets the recycling of the constant pool and the unloading of classes, although the "returns" of such collection are generally disappointing, especially for class unloading, whose conditions are quite harsh.<br />6. Constant pool<br />Besides descriptions of the class's version, fields, methods, interfaces, and so on, a Class file contains one more piece of information: the constant table (constant_pool table), used to store constants known at compile time; this content enters the method area's (permanent generation's) constant pool after the class is loaded. But the Java language does not require that constants enter the method-area constant pool only if they were placed in the Class constant table at compile time: new content may also be put into the constant pool at runtime (the most typical example being the String.intern() method).</p>



Posted by ytl, 2012-03-01 18:12
]]>
Java source-code study http://m.tkk7.com/ytl-zlq/archive/2011/09/24/359414.html ytl Sat, 24 Sep 2011 07:30:00 GMT       On the transient, volatile and strictfp keywords in Java: http://www.iteye.com/topic/52957
       (1), ArrayList is backed by an Object array: private transient Object[] elementData; when it is instantiated with the no-argument constructor, the array's default length is 10.<br />      (2), The add method implementation:
      public boolean add(E e) {
          // ensureCapacityInternal checks whether adding a new element
          // requires growing the backing array, and grows it if needed
          ensureCapacityInternal(size + 1);  // method called in JDK 7; JDK 5 used the ensureCapacity method
          elementData[size++] = e; // store the element and increment the element count size by 1
          return true;
      }
     The JDK 7 implementation of ensureCapacityInternal:
   private void ensureCapacityInternal(int minCapacity) {
        modCount++; // structural-modification count
        // overflow-conscious code
        if (minCapacity - elementData.length > 0)
            grow(minCapacity); // grow the array if needed
    }
/**
     * The maximum size of array to allocate.
     * Some VMs reserve some header words in an array.
     * Attempts to allocate larger arrays may result in
     * OutOfMemoryError: Requested array size exceeds VM limit
     */
    private static final int MAX_ARRAY_SIZE = Integer.MAX_VALUE - 8; // why minus 8? see line 2 of the comment above
    /**
     * Increases the capacity to ensure that it can hold at least the
     * number of elements specified by the minimum capacity argument.
     *
     * @param minCapacity the desired minimum capacity
     */
    private void grow(int minCapacity) {
        // overflow-conscious code
        int oldCapacity = elementData.length;
        int newCapacity = oldCapacity + (oldCapacity >> 1); // new capacity is 1.5x the old; the shift is cheaper than division. JDK 5 used (oldCapacity * 3) / 2 + 1
        if (newCapacity - minCapacity < 0)
            newCapacity = minCapacity;
        if (newCapacity - MAX_ARRAY_SIZE > 0) // requested capacity exceeds the soft limit
            newCapacity = hugeCapacity(minCapacity);
        // minCapacity is usually close to size, so this is a win:
        elementData = Arrays.copyOf(elementData, newCapacity);
    }
 //鍙互鐢寵鐨勬渶澶ч暱搴?/div>
    private static int hugeCapacity(int minCapacity) { 
        if (minCapacity < 0) // overflow
            throw new OutOfMemoryError();
        return (minCapacity > MAX_ARRAY_SIZE) ?
            Integer.MAX_VALUE :
            MAX_ARRAY_SIZE;
    }
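The 1.5x growth rule above is easy to check outside the JVM. A minimal Python sketch (the starting capacity 10 and the `old + (old >> 1)` formula come from the source above; the helper name and everything else are illustrative):

```python
def grow_sequence(start=10, target=100):
    """Yield the successive ArrayList capacities until `target` elements fit."""
    capacity = start
    sizes = [capacity]
    while capacity < target:
        capacity = capacity + (capacity >> 1)  # JDK 7: grow by 1.5x via a shift
        sizes.append(capacity)
    return sizes

print(grow_sequence())  # [10, 15, 22, 33, 49, 73, 109]
```

So appending 100 elements to a default ArrayList triggers six array copies, which is why pre-sizing with `new ArrayList<>(expected)` helps for large lists.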





Posted by ytl, 2011-09-24 15:30
]]>Metering points, metering classifications, etc. http://m.tkk7.com/ytl-zlq/archive/2011/08/07/355934.html ytl Sun, 07 Aug 2011 03:02:00 GMT Read the full article

Posted by ytl, 2011-08-07 11:02
]]>
Algorithms to quicksort http://m.tkk7.com/ytl-zlq/archive/2011/05/08/349777.html ytl Sun, 08 May 2011 06:13:00 GMT http://m.tkk7.com/ytl-zlq/archive/2011/05/08/349777.html

Quicksort

Quicksort is a fast sorting algorithm, which is used not only for educational purposes, but widely applied in practice. On the average, it has O(n log n) complexity, making quicksort suitable for sorting big data volumes. The idea of the algorithm is quite simple and once you realize it, you can write quicksort as fast as bubble sort.

Algorithm

The divide-and-conquer strategy is used in quicksort. Below the recursion step is described:
  1. Choose a pivot value. We take the value of the middle element as the pivot value, but it can be any value that is in the range of the sorted values, even if it is not present in the array.
  2. Partition. Rearrange the elements in such a way that all elements which are less than the pivot go to the left part of the array and all elements greater than the pivot go to the right part of the array. Values equal to the pivot can stay in any part of the array. Notice that the array may be divided into non-equal parts.
  3. Sort both parts. Apply quicksort algorithm recursively to the left and the right parts.
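The three steps above can be sketched directly in Python. This is an illustrative functional version (it allocates new lists on each call), not the in-place partition scheme described next:

```python
def quicksort(values):
    # Base case: 0 or 1 elements are already sorted.
    if len(values) < 2:
        return values
    pivot = values[len(values) // 2]          # step 1: choose a pivot
    left = [x for x in values if x < pivot]   # step 2: partition around it
    mid = [x for x in values if x == pivot]
    right = [x for x in values if x > pivot]
    return quicksort(left) + mid + quicksort(right)  # step 3: sort both parts

print(quicksort([1, 12, 5, 26, 7, 14, 3, 7, 2]))  # [1, 2, 3, 5, 7, 7, 12, 14, 26]
```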

Partition algorithm in detail

There are two indices i and j; at the very beginning of the partition algorithm i points to the first element in the array and j points to the last one. Then the algorithm moves i forward until an element with value greater than or equal to the pivot is found. Index j is moved backward until an element with value less than or equal to the pivot is found. If i ≤ j then they are swapped, i steps to the next position (i + 1), and j steps to the previous one (j - 1). The algorithm stops when i becomes greater than j.

After partition, all values before i-th element are less or equal than the pivot and all values after j-th element are greater or equal to the pivot.

Example. Sort {1, 12, 5, 26, 7, 14, 3, 7, 2} using quicksort.

Quicksort example

Notice, that we show here only the first recursion step, in order not to make example too long. But, in fact, {1, 2, 5, 7, 3} and {14, 7, 26, 12} are sorted then recursively.

Why does it work?

On the partition step algorithm divides the array into two parts and every element a from the left part is less or equal than every element b from the right part. Also a and b satisfy a ≤ pivot ≤ b inequality. After completion of the recursion calls both of the parts become sorted and, taking into account arguments stated above, the whole array is sorted.

Complexity analysis

On the average quicksort has O(n log n) complexity, but strong proof of this fact is not trivial and not presented here. Still, you can find the proof in [1]. In worst case, quicksort runs O(n2) time, but on the most "practical" data it works just fine and outperforms other O(n log n) sorting algorithms.

Code snippets

Java

int partition(int arr[], int left, int right)
{
     int i = left;
     int j = right;
     int temp;
     int pivot = arr[(left + right) >> 1];
     while (i <= j) {
        // move i forward while elements are less than the pivot
        while (arr[i] < pivot) {
            i++;
        }
        // move j backward while elements are greater than the pivot
        while (arr[j] > pivot) {
            j--;
        }
        if (i <= j) {
            temp = arr[i];
            arr[i] = arr[j];
            arr[j] = temp;
            i++;
            j--;
        }
    }
    return i;
}

 

void quickSort(int arr[], int left, int right) {
      int index = partition(arr, left, right);
      if (left < index - 1) {
         quickSort(arr, left, index - 1);
      }
      if (index < right) {
         quickSort(arr, index, right);
      }
}

python

def quickSort(L, left, right):
    i = left
    j = right
    pivot = L[(left + right) >> 1]
    # partition
    while i <= j:
        while L[i] < pivot:
            i += 1
        while L[j] > pivot:
            j -= 1
        if i <= j:
            L[i], L[j] = L[j], L[i]
            i += 1
            j -= 1
    # recursion
    if left < j:
        quickSort(L, left, j)
    if i < right:
        quickSort(L, i, right)



Posted by ytl, 2011-05-08 14:13
]]>
Algorithms to Insertion Sort http://m.tkk7.com/ytl-zlq/archive/2011/05/08/349773.html ytl Sun, 08 May 2011 04:24:00 GMT http://m.tkk7.com/ytl-zlq/archive/2011/05/08/349773.html

Insertion Sort

Insertion sort belongs to the O(n2) sorting algorithms. Unlike many sorting algorithms with quadratic complexity, it is actually applied in practice for sorting small arrays of data. For instance, it is used to improve the quicksort routine. Some sources note that people use the same algorithm when ordering items by hand, for example a hand of cards.

Algorithm

The insertion sort algorithm somewhat resembles selection sort. The array is imaginarily divided into two parts - a sorted one and an unsorted one. At the beginning, the sorted part contains the first element of the array and the unsorted one contains the rest. At every step, the algorithm takes the first element of the unsorted part and inserts it into the right place in the sorted one. When the unsorted part becomes empty, the algorithm stops. Sketchily, an insertion sort step looks like this:

Insertion sort sketchy, before insertion

becomes

Insertion sort sketchy, after insertion

The idea of the sketch was originally posted here.

Let us see an example of insertion sort routine to make the idea of algorithm clearer.

Example. Sort {7, -5, 2, 16, 4} using insertion sort.

Insertion sort example

The ideas of insertion

The main operation of the algorithm is insertion. The task is to insert a value into the sorted part of the array. Let us see the variants of how we can do it.

"Sifting down" using swaps

The simplest way to insert next element into the sorted part is to sift it down, until it occupies correct position. Initially the element stays right after the sorted part. At each step algorithm compares the element with one before it and, if they stay in reversed order, swap them. Let us see an illustration.

insertion sort, sift down illustration

This approach writes sifted element to temporary position many times. Next implementation eliminates those unnecessary writes.

Shifting instead of swapping

We can modify previous algorithm, so it will write sifted element only to the final correct position. Let us see an illustration.

insertion sort, shifting illustration

It is the most commonly used modification of the insertion sort.

Using binary search

It is reasonable to use the binary search algorithm to find the proper place for insertion. This variant of insertion sort is called binary insertion sort. After the position for insertion is found, the algorithm shifts part of the array and inserts the element. This version has a lower number of comparisons, but the overall average complexity remains O(n2). From a practical point of view this improvement is not very important, because insertion sort is used on quite small data sets.
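A binary insertion sort can be sketched in a few lines with Python's standard `bisect` module (an illustrative version, not this article's own code):

```python
import bisect

def binary_insertion_sort(arr):
    """Sort arr in place, using binary search to locate each insertion point."""
    for i in range(1, len(arr)):
        value = arr[i]
        # O(log i) comparisons to find where value belongs in the sorted prefix
        pos = bisect.bisect_right(arr, value, 0, i)
        # O(i) writes to shift the tail right and insert (overall still O(n^2))
        arr[pos + 1:i + 1] = arr[pos:i]
        arr[pos] = value
    return arr

print(binary_insertion_sort([7, -5, 2, 16, 4]))  # [-5, 2, 4, 7, 16]
```

Using `bisect_right` keeps equal elements in their original order, so the sort stays stable.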

Complexity analysis

Insertion sort's overall complexity is O(n2) on average, regardless of the method of insertion. On the almost sorted arrays insertion sort shows better performance, up to O(n) in case of applying insertion sort to a sorted array. Number of writes is O(n2) on average, but number of comparisons may vary depending on the insertion algorithm. It is O(n2) when shifting or swapping methods are used and O(n log n) for binary insertion sort.

From the point of view of practical application, an average complexity of the insertion sort is not so important. As it was mentioned above, insertion sort is applied to quite small data sets (from 8 to 12 elements). Therefore, first of all, a "practical performance" should be considered. In practice insertion sort outperforms most of the quadratic sorting algorithms, like selection sort or bubble sort.

Insertion sort properties

  • adaptive (performance adapts to the initial order of elements);
  • stable (insertion sort retains relative order of the same elements);
  • in-place (requires constant amount of additional space);
  • online (new elements can be added during the sort).

Code snippets

We show the idea of insertion with shifts in Java implementation and the idea of insertion using python code snippet.

Java implementation

void insertionSort(int[] arr) {
      int i, j, newValue;
      for (i = 1; i < arr.length; i++) {
           newValue = arr[i];
           j = i;
           // shift larger elements one position to the right
           while (j > 0 && arr[j - 1] > newValue) {
               arr[j] = arr[j - 1];
               j--;
           }
           arr[j] = newValue; // write the sifted element once, to its final place
      }
}

Python implementation

def insertionSort(L):
      for i in range(1, len(L)):
            newValue = L[i]
            j = i
            # shift larger elements right until the insertion point is found
            while j > 0 and L[j - 1] > newValue:
                  L[j] = L[j - 1]
                  j = j - 1
            L[j] = newValue



Posted by ytl, 2011-05-08 12:24
]]>
Binary search algorithm http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349702.html ytl Fri, 06 May 2011 10:11:00 GMT http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349702.html

Binary search algorithm

Generally, to find a value in an unsorted array, we should look through the elements of the array one by one, until the searched value is found. In case the searched value is absent from the array, we go through all elements. On average, the complexity of such an algorithm is proportional to the length of the array.

The situation changes significantly when the array is sorted. If we know it, random access capability can be utilized very efficiently to find the searched value quickly. The cost of the searching algorithm reduces to the binary logarithm of the array length. For reference, log2(1 000 000) ≈ 20. It means that, in the worst case, the algorithm makes 20 steps to find a value in a sorted array of a million elements, or to say that it is not present in the array.

Algorithm

Algorithm is quite simple. It can be done either recursively or iteratively:

  1. get the middle element;
  2. if the middle element equals to the searched value, the algorithm stops;
  3. otherwise, two cases are possible:
    • searched value is less, than the middle element. In this case, go to the step 1 for the part of the array, before middle element.
    • searched value is greater, than the middle element. In this case, go to the step 1 for the part of the array, after middle element.
Now we should define when the iterations should stop. The first case is when the searched element is found. The second one is when the subarray has no elements. In this case, we can conclude that the searched value is not present in the array.

Examples

Example 1. Find 6 in {-1, 5, 6, 18, 19, 25, 46, 78, 102, 114}.

Step 1 (middle element is 19 > 6):     -1  5  6  18  19  25  46  78  102  114

Step 2 (middle element is 5 < 6):      -1  5  6  18  19  25  46  78  102  114

Step 3 (middle element is 6 == 6):     -1  5  6  18  19  25  46  78  102  114

Example 2. Find 103 in {-1, 5, 6, 18, 19, 25, 46, 78, 102, 114}.

Step 1 (middle element is 19 < 103):   -1  5  6  18  19  25  46  78  102  114

Step 2 (middle element is 78 < 103):   -1  5  6  18  19  25  46  78  102  114

Step 3 (middle element is 102 < 103):  -1  5  6  18  19  25  46  78  102  114

Step 4 (middle element is 114 > 103):  -1  5  6  18  19  25  46  78  102  114

Step 5 (searched value is absent):     -1  5  6  18  19  25  46  78  102  114

Complexity analysis

A huge advantage of this algorithm is that its complexity depends on the array size logarithmically in the worst case. In practice it means that the algorithm will do at most log2(n) iterations, which is a very small number even for big arrays. It can be proved very easily. Indeed, on every step the size of the searched part is reduced by half. The algorithm stops when there are no elements to search in. Therefore, solving the following inequality in whole numbers:

n / 2^iterations > 0

resulting in

iterations <= log2(n).

It means, that binary search algorithm time complexity is O(log2(n)).

Code snippets.

You can see recursive solution for Java and iterative for python below.

Java

int binarySearch(int[] array, int value, int left, int right) {

      if (left > right)

            return -1;

      int middle = left + (right-left) / 2;

      if (array[middle] == value)

            return middle;

      if (array[middle] > value)

            return binarySearch(array, value, left, middle - 1);

      else

            return binarySearch(array, value, middle + 1, right);           

}

Python

def biSearch(L, e, first, last):
      if last - first < 2:
            return L[first] == e or L[last] == e
      mid = first + (last - first) // 2
      if L[mid] == e:
            return True
      if L[mid] > e:
            return biSearch(L, e, first, mid - 1)
      return biSearch(L, e, mid + 1, last)

      



Posted by ytl, 2011-05-06 18:11
]]>
Algorithm to merge sort http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349695.html ytl Fri, 06 May 2011 09:05:00 GMT Merge sort is an O(n log n) comparison-based sorting algorithm. Most implementations produce a stable sort, meaning that the implementation preserves the input order of equal elements in the sorted output. It is a divide and conquer algorithm. Merge sort was invented by John von Neumann in 1945. A detailed description and analysis of bottom-up mergesort appeared in a report by Goldstine and von Neumann as early as 1948.
 Divide and conquer algorithm: 1, split the problem into several subproblems of the same type. 2, solve each independently. 3, combine those solutions.



Python implementation

  def mergeSort(L):
        # merge(left, right) merges two sorted lists; see the post
        # "Algorithm to merge sorted arrays" for its implementation
        if len(L) < 2:
              return L
        middle = len(L) // 2
        left = mergeSort(L[:middle])
        right = mergeSort(L[middle:])
        together = merge(left, right)
        return together
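Since the `merge` helper lives in a separate post, here is a self-contained sketch combining both halves (illustrative; it follows the same split-solve-combine structure described above):

```python
def merge(left, right):
    """Merge two sorted lists into one sorted list."""
    result = []
    i, j = 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i += 1
        else:
            result.append(right[j])
            j += 1
    result.extend(left[i:])   # copy whichever tail remains
    result.extend(right[j:])
    return result

def merge_sort(L):
    # 1. split; 2. solve each half independently; 3. combine
    if len(L) < 2:
        return L[:]
    middle = len(L) // 2
    return merge(merge_sort(L[:middle]), merge_sort(L[middle:]))

print(merge_sort([5, 1, 12, -5, 16]))  # [-5, 1, 5, 12, 16]
```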


Posted by ytl, 2011-05-06 17:05
]]>
Algorithm to merge sorted arrays http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349692.html ytl Fri, 06 May 2011 08:55:00 GMT http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349692.html

Algorithm to merge sorted arrays

In the article we present an algorithm for merging two sorted arrays. One can learn how to operate with several arrays and master read/write indices. Also, the algorithm has certain applications in practice, for instance in merge sort.

Merge algorithm

Assume that both arrays are sorted in ascending order and we want the resulting array to maintain the same order. The algorithm to merge two arrays A[0..m-1] and B[0..n-1] into an array C[0..m+n-1] is as follows:

  1. Introduce read indices i, j to traverse arrays A and B, accordingly. Introduce a write index k to store the position of the first free cell in the resulting array. By default i = j = k = 0.
  2. At each step: if both indices are in range (i < m and j < n), choose the minimum of (A[i], B[j]) and write it to C[k]. Otherwise go to step 4.
  3. Increase k, and the index of the array the minimal value was located in, by one. Repeat step 2.
  4. Copy the remaining values from the array whose index is still in range to the resulting array.

Enhancements

The algorithm could be enhanced in many ways. For instance, it is reasonable to check whether A[m - 1] < B[0] or B[n - 1] < A[0]. In either of those cases, there is no need to do more comparisons: the algorithm can just copy the source arrays into the resulting one in the right order. More complicated enhancements may include searching for interleaving parts and running the merge algorithm for them only. It could save much time when the sizes of the merged arrays differ by scores of times.
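The first enhancement can be sketched as a fast path in front of the normal merge (illustrative; `merge_with_fast_path` is not from the article):

```python
def merge_with_fast_path(A, B):
    """Merge two sorted lists, skipping all comparisons when they don't overlap."""
    if not A or not B:
        return A + B
    if A[-1] <= B[0]:        # every A element precedes every B element
        return A + B
    if B[-1] <= A[0]:        # every B element precedes every A element
        return B + A
    # fall back to the ordinary O(n + m) merge
    result, i, j = [], 0, 0
    while i < len(A) and j < len(B):
        if A[i] <= B[j]:
            result.append(A[i])
            i += 1
        else:
            result.append(B[j])
            j += 1
    return result + A[i:] + B[j:]

print(merge_with_fast_path([1, 2, 3], [4, 5]))  # fast path: [1, 2, 3, 4, 5]
```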

Complexity analysis

Merge algorithm's time complexity is O(n + m). Additionally, it requires O(n + m) additional space to store resulting array.

Code snippets

Java implementation

// size of C array must be equal or greater than
// sum of A and B arrays' sizes
public void merge(int[] A, int[] B, int[] C) {
      int i, j, k, m, n;
      i = 0;
      j = 0;
      k = 0;
      m = A.length;
      n = B.length;
      while (i < m && j < n) {
          if (A[i] <= B[j]) {
              C[k] = A[i];
              i++;
          } else {
              C[k] = B[j];
              j++;
          }
          k++;
      }
      // copy the remaining tail of whichever array is not exhausted
      while (i < m) {
         C[k] = A[i];
         i++;
         k++;
      }
      while (j < n) {
         C[k] = B[j];
         j++;
         k++;
      }
}


Python implementation

def merge(left, right):
    result = []
    i, j = 0, 0
    while i < len(left) and j < len(right):
        if left[i] <= right[j]:
            result.append(left[i])
            i = i + 1
        else:
            result.append(right[j])
            j = j + 1
    while i < len(left):
        result.append(left[i])
        i = i + 1
    while j < len(right):
        result.append(right[j])
        j = j + 1
    return result

  
MergeSort:

import operator

def mergeSort(L, compare = operator.lt):
     if len(L) < 2:
          return L[:]
     else:
          middle = int(len(L)/2)
          left = mergeSort(L[:middle], compare)
          right= mergeSort(L[middle:], compare)
          return merge(left, right, compare)

def merge(left, right, compare):
     result = []
     i, j = 0, 0

     while i < len(left) and j < len(right):
          if compare(left[i], right[j]):
               result.append(left[i])
               i += 1
          else:
                result.append(right[j])
                j += 1
     while i < len(left):
          result.append(left[i])
          i += 1
     while j < len(right):
          result.append(right[j])
          j += 1
     return result
               



Posted by ytl, 2011-05-06 16:55
]]>
Sorting algorithms -- Selection Sort http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349687.html ytl Fri, 06 May 2011 08:16:00 GMT http://m.tkk7.com/ytl-zlq/archive/2011/05/06/349687.html

Selection Sort

Selection sort is one of the O(n2) sorting algorithms, which makes it quite inefficient for sorting large data volumes. Selection sort is notable for its programming simplicity and it can outperform other sorts in certain situations (see complexity analysis for more details).

Algorithm

The idea of the algorithm is quite simple. The array is imaginarily divided into two parts - a sorted one and an unsorted one. At the beginning, the sorted part is empty, while the unsorted one contains the whole array. At every step, the algorithm finds the minimal element in the unsorted part and adds it to the end of the sorted one. When the unsorted part becomes empty, the algorithm stops.

When the algorithm sorts an array, it swaps the first element of the unsorted part with the minimal element and then includes it in the sorted part. This implementation of selection sort is not stable. In case a linked list is sorted and, instead of swaps, the minimal element is re-linked into the sorted part, selection sort is stable.

Let us see an example of sorting an array to make the idea of selection sort clearer.

Example. Sort {5, 1, 12, -5, 16, 2, 12, 14} using selection sort.

Selection sort example

Complexity analysis

Selection sort stops, when unsorted part becomes empty. As we know, on every step number of unsorted elements decreased by one. Therefore, selection sort makes n steps (n is number of elements in array) of outer loop, before stop. Every step of outer loop requires finding minimum in unsorted part. Summing up, n + (n - 1) + (n - 2) + ... + 1, results in O(n2) number of comparisons. Number of swaps may vary from zero (in case of sorted array) to n - 1 (in case array was sorted in reversed order), which results in O(n) number of swaps. Overall algorithm complexity is O(n2).

The fact that selection sort requires at most n - 1 swaps makes it very efficient in situations where the write operation is significantly more expensive than the read operation.
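The "at most n - 1 swaps" claim is easy to instrument (a sketch; the swap counter is added for illustration and is not part of the article's code):

```python
def selection_sort_count_swaps(L):
    """Selection-sort L in place and return how many swaps were performed."""
    swaps = 0
    n = len(L)
    for i in range(n - 1):
        min_index = i
        for j in range(i + 1, n):
            if L[j] < L[min_index]:
                min_index = j
        if min_index != i:      # swap only when the minimum moved
            L[i], L[min_index] = L[min_index], L[i]
            swaps += 1
    return swaps

data = [5, 1, 12, -5, 16, 2, 12, 14]
swaps = selection_sort_count_swaps(data)
print(data, swaps)  # sorted list; swaps never exceeds len(data) - 1
```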

Code snippets

Java

public void selectionSort(int[] arr) {

      int i, j, minIndex, tmp;

      int n = arr.length;

      for (i = 0; i < n - 1; i++) {

            minIndex = i;

            for (j = i + 1; j < n; j++)

                  if (arr[j] < arr[minIndex])

                        minIndex = j;

            if (minIndex != i) {

                  tmp = arr[i];

                  arr[i] = arr[minIndex];

                  arr[minIndex] = tmp;

            }

      }

}

Python

def selectionSort(L):
     for i in range(len(L) - 1):
          minIndex = i
          minValue = L[i]
          j = i + 1
          while j < len(L):
               if minValue > L[j]:
                    minIndex = j
                    minValue = L[j]
               j += 1
          if minIndex != i:
               L[i], L[minIndex] = L[minIndex], L[i]




Posted by ytl, 2011-05-06 16:16
]]>