.terrierteam.jtreceval.0.0.2.source-code.trec_eval-win-x86 Maven / Gradle / Ivy

[Binary content omitted. The file is a Windows PE executable (trec_eval built for win-x86); its machine code, section headers, import tables (cygwin runtime DLLs), and internal diagnostic and debug-dump format strings are not reproduced here. The readable documentation embedded in the binary, consisting of the input file format descriptions and the evaluation measure descriptions, follows.]
qrels    Rel_info_file format: Standard 'qrels'
Relevance for each docno to qid is determined from rel_info_file, which 
consists of text tuples of the form 
   qid  iter  docno  rel 
giving TREC document numbers (docno, a string) and their relevance (rel,  
a non-negative integer less than 128, or -1 (unjudged)) 
to query qid (a string).  The iter string field is ignored. 
Fields are separated by whitespace, string fields can contain no whitespace. 
File may contain no NULL characters. 
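For illustration only, a tiny qrels fragment in this format might look like the
following (the qids and docnos here are invented, not taken from any TREC collection):
      q1  0  doc-0017  1
      q1  0  doc-0042  0
      q2  0  doc-0008  2
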
qrels_jg    Rel_info_file format: Standard 'qrels'
Relevance for each docno to qid is determined from rel_info_file, which 
consists of text tuples of the form 
   qid  ujg  docno  rel 
giving TREC document numbers (docno, a string) and their relevance (rel,  
a non-negative integer less than 128, or -1 (unjudged)) 
to query qid (a string) for a particular user judgment group. 
This allows averaging (or other operations) of appropriate evaluation measures
across multiple users, who may differ in their judgments. 
Fields are separated by whitespace, string fields can contain no whitespace. 
File may contain no NULL characters. 
prefs    Rel_info_file format: Non-standard 'prefs'
Preferences of user(s) for docs for a given qid is determined from
text_prefs_file, which consists of text tuples of the form
   qid  ujg  ujsubg  docno  rel_level
giving TREC document numbers (docno, a string) and their relevance
level (rel_level, a non-negative float) to query qid (a string) for a 
user judgment sub-group (ujsubg, a string) within a user judgment
group (ujg, a string).
Fields are separated by whitespace, string fields can contain no whitespace.
File may contain no NULL characters.

Preferences are indicated indirectly by comparing rel_level of
different docnos within the same user judgment sub-group (JSG).  A
judgment sub group establishes preferences between all docnos with
non-tied rel_levels within the group. Except possibly for 0.0, the
actual values of rel_level are ignored by default; they only serve to
establish a ranking within the JSG.

If a user only expresses a preference between two docs, then that user's JSG
will have 2 lines in text_prefs_file:
      qid1  ujg1  sub1 docno1  3.0
      qid1  ujg1  sub1 docno2  2.0

If a user completely ranks some small number N (5-10) of docs, then N lines 
are used.
For example:
      qid1  ujg1  sub1  docno1  3.0
      qid1  ujg1  sub1  docno2  2.0
      qid1  ujg1  sub1  docno3  0.0
      qid1  ujg1  sub1  docno4  6.0
      qid1  ujg1  sub1  docno5  0.0
      qid1  ujg1  sub1  docno6  2.0
establishes a total of 13 preferences (5 with docno4 preferred, 4 with docno1 
preferred, 2 each with docno2 and docno6 preferred).
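As a cross-check of that counting rule, here is a small illustrative sketch
(Python, not part of trec_eval) that derives the preferences implied by the
rel_levels of a single JSG; it reproduces the 13 preferences of the example above:
      # Every pair of docs in a JSG with non-tied rel_levels yields one
      # preference for the doc with the higher rel_level.
      from itertools import combinations

      jsg = {"docno1": 3.0, "docno2": 2.0, "docno3": 0.0,
             "docno4": 6.0, "docno5": 0.0, "docno6": 2.0}
      prefs = [(a, b) if la > lb else (b, a)
               for (a, la), (b, lb) in combinations(jsg.items(), 2)
               if la != lb]
      print(len(prefs))   # 13, matching the example above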

If a given user has multiple preferences that aren't complete, the preferences
are expressed in multiple JSGs within a single JG.
For example:
      qid1  ujg1  sub1  docno1  3.0
      qid1  ujg1  sub1  docno2  2.0
      qid1  ujg1  sub1  docno3  1.0
      qid1  ujg1  sub2  docno1  2.0
      qid1  ujg1  sub2  docno2  1.0
      qid1  ujg1  sub2  docno4  3.0
expresses 5 preferences (1>2, 1>3, 2>3, 4>1, 4>2).  Note the duplicate
1>2 is not counted as a separate preference.

Multiple users are indicated by different JGs.
For example:
      qid1  ujg1  sub1  docno1  3.0
      qid1  ujg1  sub1  docno2  2.0
      qid1  ujg2  sub1  docno1  0.0
      qid1  ujg2  sub1  docno3  6.0
      qid1  ujg2  sub1  docno4  2.0
      qid1  ujg2  sub2  docno1  0.0
      qid1  ujg2  sub2  docno2  8.0
expresses 5 preferences (1>2, 3>1, 4>1, 3>4, 2>1).

A Judgment Group (JG) conceptually represents preferences for a single
information need of a user at a single time.  Within a single JG, it
is an error if there are inconsistencies (doc A > doc B in one JSG,
but B > A or B == A in another).  The different JSGs within a JG are
just a mechanism that allows expressing partial ordering within a JG.
Within a single JG, preferences are transitive:
      qid1  ujg1  sub1  docno1  3.0
      qid1  ujg1  sub1  docno2  2.0
      qid1  ujg1  sub1  docno3  1.0
      qid1  ujg1  sub2  docno2  5.0
      qid1  ujg1  sub2  docno4  4.0
expresses 5 preferences (1>2, 1>3, 2>3, 2>4, 1>4).  There is no
preference expressed between 3 and 4.

Different JGs may contain contradictory preferences, as in an earlier
example.  These disagreements are realistic and desirable: users (or
even the same user at different times) often do not agree with each
other's preferences.  Individual preference evaluation measures will
handle these contradictions (or confirmations) in different ways.

A rel_level of 0.0 by convention means that doc is non-relevant to the
topic (in that user's opinion).  It is an inconsistency (and an error)
if a doc is assigned a rel_level of 0.0 in one JSG, but a different
rel_level value in another JSG of the same JG.  Some preference
evaluation measures may handle 0.0 differently.  Thus when converting
a preference file in some other format into text_prefs format, do not
assign a rel_level of 0.0 to a docno unless it is known that docno was
considered nonrelevant.

Handling of rel_level 0.0 separately addresses the general problem
that the number of nonrelevant docs judged for a topic can be critical
to fair evaluation - adding a couple of hundred preferences involving
nonrelevant docs (out of the possibly millions or billions in a
collection) can both change the importance of the topic when averaging
and even change whether system A scores better than system B on a
topic (even given identical retrieval on the added nonrel docs).  How
to handle this correctly for preference evaluation will be an
important future research problem.
qrels_prefs    Rel_info_file format: Non-standard 'qrels_prefs'
The file format is exactly the same as rel_info_file format 'qrels',
however it is interpreted as a restricted 'prefs' rel_info_file.
It cannot represent some user preferences (in particular, if a single user
prefers Doc A to Doc B, and B to C, but does not express a preference
between A and C), but it allows the standard TREC qrels file to serve as 
input for preference evaluation measures.

Read all relevance preference information from text_qrels_prefs_file.
Preferences of user(s) for docs for a given qid is determined from
text_prefs_file, which consists of text tuples of the form
   qid  ujg   docno  rel_level
giving TREC document numbers (docno, a string) and their relevance
level (rel_level, a non-negative float) to query qid (a string) for a 
user judgment group (ujg, a string).
Fields are separated by whitespace, string fields can contain no whitespace.
File may contain no NULL characters.

Preferences are indicated indirectly by comparing rel_level of
different docnos within the same user judgment group (JG).  A
judgment group establishes preferences between all docnos with
non-tied rel_levels within the group. Except possibly for 0.0, the
actual values of rel_level are ignored by default; they only serve to
establish a ranking within the JG.

If a user only expresses a preference between two docs, then that user's JG
will have 2 lines in text_prefs_file:
      qid1  ujg1   docno1  3.0
      qid1  ujg1   docno2  2.0

If a user completely ranks some small number N (5-10) of docs, then N lines 
are used.
For example:
      qid1  ujg1    docno1  3.0
      qid1  ujg1    docno2  2.0
      qid1  ujg1    docno3  0.0
      qid1  ujg1    docno4  6.0
      qid1  ujg1    docno5  0.0
      qid1  ujg1    docno6  2.0
establishes a total of 13 preferences (5 with docno4 preferred, 4 with docno1 
preferred, 2 each with docno2 and docno6 preferred).

A Judgment Group (JG) conceptually represents preferences for a single
information need of a user at a single time.  Within a single JG, it
is an error if there are inconsistencies (doc A > doc B in one JSG,
but B > A or B == A in another).

Different JGs may contain contradictory preferences.  These
disagreements are realistic and desirable: users (or even the same
user at different times) often do not agree with each other's
preferences.  Individual preference evaluation measures will handle
these contradictions (or confirmations) in different ways.

A rel_level of 0.0 by convention means that doc is non-relevant to the
topic (in that user's opinion).  Some preference evaluation measures
may handle 0.0 differently.  Thus when converting a preference file in
some other format into text_prefs format, do not assign a rel_level of
0.0 to a docno unless it is known that docno was considered
nonrelevant.

Handling of rel_level 0.0 separately addresses the general problem
that the number of nonrelevant docs judged for a topic can be critical
to fair evaluation - adding a couple of hundred preferences involving
nonrelevant docs (out of the possibly millions or billions in a
collection) can both change the importance of the topic when averaging
and even change whether system A scores better than system B on a
topic (even given identical retrieval on the added nonrel docs).  How
to handle this correctly for preference evaluation will be an
important future research problem.
trec_results    Results_file format: Standard 'trec_results'
Lines of results_file are of the form 
     030  Q0  ZF08-175-870  0   4238   prise1 
     qid iter   docno      rank  sim   run_id 
giving TREC document numbers (a string) retrieved by query qid  
(a string) with similarity sim (a float).  The other fields are ignored, 
with the exception that the run_id field of the last line is kept and 
output.  In particular, note that the rank field is ignored here; 
internally ranks are assigned by sorting by the sim field with ties  
broken deterministically (using docno). 
Sim is assumed to be higher for the docs to be retrieved first. 
File may contain no NULL characters. 
Lines may contain fields after the run_id; they are ignored. 
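A typical invocation (file names below are placeholders; option support can
vary slightly between trec_eval versions) evaluates a results file against a
rel_info file, optionally printing per-query scores with -q:
      trec_eval -m map -m P.5,10,20 qrels.test results.test
      trec_eval -q -m map qrels.test results.test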
Process for evaluating qrels and trec_results
Process for evaluating qrels_jg and trec_results
Process for evaluating prefs and trec_results
Process for evaluating qrels_prefs and trec_results

   Copyright (c) 2008 - Chris Buckley.

   Permission is granted for use and modification of this file for
   research, non-commercial purposes.

trec_eval.get_prefs: Cannot read prefs file '%s'
trec_eval.get_prefs: Malformed line %ld
trec_eval.get_qrels: Cannot read qrels file '%s'
trec_eval.get_qrels: Malformed line %ld
trec_eval.get_qrels: Cannot read qrels file '%s'
trec_eval.get_qrels_jg: Malformed line %ld
trec_eval.get_qrels: duplicate docs %s
trec_eval.get_prefs: Cannot read prefs file '%s'
trec_eval.get_qrels_prefs: Malformed line %ld
trec_eval.get_results: Cannot read results file '%s'
trec_eval.get_results: Cannot copy results file '%s'
trec_eval.get_results: Cannot close results file '%s'
trec_eval.get_results: Malformed line %ld
trec_eval.get_zscores: Cannot read zscores file '%s'
trec_eval.get_zscores: Malformed line %ld
11pt_avg    Interpolated Precision averaged over 11 recall points
    Obsolete, only use for comparisons of old runs; should use map instead.
    Average interpolated at the given recall points - default is the
    11 points being reported for ircl_prn.
    Both map and 11-pt_avg (and even R-prec) can be regarded as estimates of
    the area under the standard ircl_prn curve.
    Warning: name assumes user does not change default parameter values:
    measure name is independent of parameter values and number of parameters.
    Will actually average over all parameter values given.
    To get 3-pt_avg as in trec_eval version 8 and earlier, use
      trec_eval -m 11-pt_avg.0.2,0.5,0.8 ...
    Default usage: -m 11-pt_avg.0.0,.1,.2,.3,.4,.5,.6,.7,.8,.9,1.0
trec_eval.calc_m_11ptavg: No cutoff values
binG    Binary G
    Experimental measure. (4/10/2008)
    G is a gain related measure that combines qualities of MAP and NDCG.
    G(doc) == rel_level_gain (doc) / log2 (2+num_nonrel retrieved before doc)
    G is the average of G(doc) over all docs, normalized by
    sum (rel_level_gain).
    BinG restricts the gain to either 0 or 1 (nonrel or rel), and thus is the
    average over all rel docs of (1 / log2 (2+num_nonrel before doc))
binG: %ld %ld %6.4f
bpref    Main binary preference measure.
    Fraction of the top R nonrelevant docs that are retrieved after each
    relevant doc. Put another way: when looking at the R relevant docs, and
    the top R nonrelevant docs, if all relevant docs are to be preferred to
    nonrelevant docs, bpref is the fraction of the preferences that the
    ranking preserves.
    Cite: 'Retrieval Evaluation with Incomplete Information', Chris Buckley
    and Ellen Voorhees. In Proceedings of 27th SIGIR, 2004.
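    A minimal sketch (Python; a simplified reading of the description above,
    not the exact trec_eval implementation) for a single topic:
      # rels: judged docs in retrieval order (True = relevant, False = judged
      # nonrelevant); unjudged docs are assumed to have been removed.
      def bpref(rels, num_rel, num_nonrel):
          denom = min(num_rel, num_nonrel)        # assumes both counts are > 0
          score, nonrel_seen = 0.0, 0
          for rel in rels:
              if rel:
                  score += 1.0 - min(nonrel_seen, num_rel) / denom
              else:
                  nonrel_seen += 1
          return score / num_rel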
G    Normalized Gain
    Experimental measure 4/10/2008
    G is a gain related measure that combines qualities of MAP and NDCG.
    Contribution of doc doc retrieved at rank i is 
    G(doc) == gain (doc) / log2 (2+ideal_gain(i)-results_gain(i))
    where results_gain(i) is sum gain(doc) for all docs before i
    and ideal_gain is the maximum possible results_gain(i)
    G is the sum of G(doc) over all docs, normalized by max ideal_gain.
    Gain values are set to the appropriate relevance level by default.  
    The default gain can be overridden on the command line by having 
    comma separated parameters 'rel_level=gain'.
    Eg, 'trec_eval -m G.1=3.5,2=9.0,4=7.0 ...'
    will give gains 3.5, 9.0, 3.0, 7.0 for relevance levels 1,2,3,4
    respectively (level 3 remains at the default).
    Gains are allowed to be 0 or negative, and relevance level 0
    can be given a gain.
    The idea behind G is that the contribution of a doc retrieved at i
    should not be independent of the docs before. If most docs before have
    higher gain, then the retrieval of this doc at i is nearly as good as 
    possible, and should be rewarded appropriately
G: %ld %ld %3.1f %6.4f %3.1f %6.4f %6.4f %6.4f
G: %ld %ld %3.1f %6.4f %3.1f %6.4f %6.4f
G: %ld %ld %3.1f %6.4f %3.1f %6.4f
gm_bpref   Binary preference (bpref), but using geometric mean over topics
    See the explanation for 'bpref' for the base measure for a single topic.
    Gm_bpref uses the geometric mean to combine the single topic scores.
    This rewards methods that are more consistent across topics as opposed to
    high scores for some topics and low scores for others.
    Gm_bpref is printed only as a summary measure across topics, not for the
    individual topics.
gm_bpref: bpref %6.4f, gm_bpref %6.4f
gm_map    Geometric Mean Average Precision
    This is the same measure as 'map' (see description of 'map') on an
    individual topic, but the geometric mean is calculated when averaging
    over topics.  This rewards methods that are more consistent over topics
    as opposed to methods which do very well for some topics but very poorly
    for others.
    gm_ap is reported only in the summary over all topics, not for individual
    topics.
infAP    Inferred AP
    A measure that allows sampling of judgement pool: Qrels/results divided
    into unpooled, pooled_but_unjudged, pooled_judged_rel,pooled_judged_nonrel.
    My intuition of infAP:
    Assume a judgment pool with a random subset that has been judged.
    Calculate P at rel doc using only the judged higher retrieved docs,
    then average in 0's from higher docs that were not in the judgment pool.
    (Those in the pool but not judged are ignored, since they are assumed
    to be relevant in the same proportion as those judged.)
    Cite:    'Estimating Average Precision with Incomplete and Imperfect
    Judgments', Emine Yilmaz and Javed A. Aslam. CIKM 
iprec_at_recall    Interpolated Precision at recall cutoffs.
    This is the data shown in the standard Recall-Precision graph.
    The standard cutoffs and interpolation are needed to average data over
    multiple topics; otherwise, how is a topic with 5 relevant docs averaged
    with a topic with 3 relevant docs for graphing purposes?  The Precision 
    interpolation used here is
      Int_Prec (rankX) == MAX (Prec (rankY)) for all Y >= X.
    Default usage: -m iprec_at_recall.0,.1,.2,.3,.4,.5,.6,.7,.8,.9,1 ...
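    An illustrative sketch (Python, not trec_eval's code) of the interpolation:
      # rels: relevance (True/False) of retrieved docs in rank order;
      # num_rel: total number of relevant docs for the topic.
      def iprec_at_recall(rels, num_rel, points=tuple(i / 10 for i in range(11))):
          hits, curve = 0, []
          for rank, rel in enumerate(rels, start=1):
              hits += rel
              curve.append((hits / num_rel, hits / rank))   # (recall, precision)
          # Int_Prec at recall c = max precision at any rank with recall >= c
          return [max((p for r, p in curve if r >= c), default=0.0)
                  for c in points]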
map    Mean Average Precision
    Precision measured after each relevant doc is retrieved, then averaged
    for the topic, and then averaged over topics (if more than one).
    This is the main single-valued number used to compare the entire rankings
    of two or more retrieval methods.  It has proven in practice to be useful
    and robust.
    The name of the measure is unfortunately inaccurate since it is 
    calculated for a single topic (and thus one does not want both 'mean' and
    'average') but was dictated by common usage and the need to distinguish
    map from Precision averaged over topics (I had to give up my attempts to
    call it something else!)
    History: Developed by Chris Buckley after TREC 1.
    Cite: 'Retrieval System Evaluation', Chris Buckley and Ellen Voorhees.
    Chapter 3 in TREC: Experiment and Evaluation in Information Retrieval
    edited by Ellen Voorhees and Donna Harman.  MIT Press 2005
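    A minimal sketch (Python, illustrative only) of the per-topic computation:
      # rels: relevance (True/False) of retrieved docs in rank order;
      # num_rel: total number of relevant docs for the topic.
      def average_precision(rels, num_rel):
          hits, total = 0, 0.0
          for rank, rel in enumerate(rels, start=1):
              if rel:
                  hits += 1
                  total += hits / rank      # precision at this relevant doc
          return total / num_rel            # unretrieved relevant docs contribute 0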
map_avgjg    Mean Average Precision over judgment groups 
    Precision measured after each relevant doc is retrieved, then averaged
    for the topic, and then averaged over judgement group (user) and then 
    averaged over topics (if more than one).
    Same as the workhorse measure 'map' except if there is more than one
    set of relevance judgments for this query (each set indicated by a
    different judgment group), the score will be averaged over the judgment
    groups.
map_cut    Mean Average Precision at cutoffs
    Map measured at various doc level cutoffs in the ranking.
    If the cutoff is larger than the number of docs retrieved, then
    it is assumed nonrelevant docs fill in the rest.
    Map itself is precision measured after each relevant doc is retrieved,
    averaged over all relevant docs for the topic.
    Cutoffs must be positive without duplicates
    Default param: -m map_cut.5,10,15,20,30,100,200,500,1000
ndcg    Normalized Discounted Cumulative Gain
    Compute a traditional nDCG measure according to Jarvelin and
    Kekalainen (ACM ToIS v. 20, pp. 422-446, 2002)
    Gain values are set to the appropriate relevance level by default.  
    The default gain can be overridden on the command line by having 
    comma separated parameters 'rel_level=gain'.
    Eg, 'trec_eval -m ndcg.1=3.5,2=9.0,4=7.0 ...'
    will give gains 3.5, 9.0, 3.0, 7.0 for relevance levels 1,2,3,4
    respectively (level 3 remains at the default).
    Gains are allowed to be 0 or negative, and relevance level 0
    can be given a gain.
    Based on an implementation by Ian Soboroff
ndcg: %ld %ld %3.1f %6.4f %3.1f %6.4f
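    An illustrative sketch (Python) of the gain/discount idea; a common
    formulation discounts by log2(rank+1), though trec_eval's exact discount
    and gain handling may differ in detail:
      from math import log2

      def dcg(gains):                      # gains of retrieved docs, in rank order
          return sum(g / log2(rank + 1) for rank, g in enumerate(gains, start=1))

      def ndcg(gains, judged_gains):       # normalize by the ideal ordering
          ideal = dcg(sorted(judged_gains, reverse=True))
          return dcg(gains) / ideal if ideal > 0 else 0.0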
ndcg_cut    Normalized Discounted Cumulative Gain at cutoffs.
    Compute a traditional nDCG measure according to Jarvelin and
    Kekalainen (ACM ToIS v. 20, pp. 422-446, 2002) at cutoffs.
    See comments for ndcg.
    Gain values are the relevance values in the qrels file.  For now, if you
    want different gains, change the qrels file appropriately.
    Cutoffs must be positive without duplicates
    Default params: -m ndcg_cut.5,10,15,20,30,100,200,500,1000
    Based on an implementation by Ian Soboroff
ndcg_cut: cutoff %ld dcg %6.4f
ndcg_cut:%ld %3.1f %6.4f
ndcg_cut: cutoff %ld idcg %6.4f
ndcg_cut:%ld %ld %3.1f %6.4f
ndcg_p    Normalized Discounted Cumulative Gain
    Compute a traditional nDCG measure according to Jarvelin and
    Kekalainen (ACM ToIS v. 20, pp. 422-446, 2002).
    Gain values are set to the appropriate relevance level by default.  
    The default gain can be overridden on the command line by having 
    comma separated parameters 'rel_level=gain'.
    Eg, 'trec_eval -m ndcg_p.1=3.5,2=9.0,4=7.0 ...'
    will give gains 3.5, 9.0, 3.0, 7.0 for relevance levels 1,2,3,4
    respectively (level 3 remains at the default).
    Gains are allowed to be 0 or negative, and relevance level 0
    can be given a gain.
    Based on an implementation by Ian Soboroff
ndcg_p:%ld %3.1f %6.4f
ndcg_p:%ld %ld %3.1f %6.4f
ndcg_rel    Normalized Discounted Cumulative Gain averaged over rel docs
    Experimental measure
    Compute a traditional nDCG measure according to Jarvelin and
    Kekalainen (ACM ToIS v. 20, pp. 422-446, 2002), averaged at rel docs.
    Idea behind ndcg_rel is that the expected value of ndcg is a smoothly
    decreasing function, with discontinuities upward at each transition
    between positive gain levels in the ideal ndcg.  Once the gain level 
    becomes 0, the expected value of ndcg then increases until all rel docs are
    retrieved. Thus averaging ndcg is problematic, because these transitions
    occur at different points for each topic.  Since it is not unusual for
    ndcg to start off near 1.0, decrease to 0.25, and then increase to 0.75
    at various cutoffs, the points at which ndcg is measured are important.
    This version averages ndcg over each relevant doc, where relevant is
    defined as expected gain > 0.  If a rel doc is not retrieved, then
    ndcg for the doc is the dcg at the end of the retrieval / ideal dcg. 
    
    Gain values are set to the appropriate relevance level by default.  
    The default gain can be overridden on the command line by having 
    comma separated parameters 'rel_level=gain'.
    Eg, 'trec_eval -m ndcg_rel.1=3.5,2=9.0,4=7.0 ...'
    will give gains 3.5, 9.0, 3.0, 7.0 for relevance levels 1,2,3,4
    respectively (level 3 remains at the default).
    Gains are allowed to be 0 or negative, and relevance level 0
    can be given a gain.
ndcg_rel: %ld %ld %3.1f %6.4f %3.1f %6.4f %6.4f
ndcg_rel: %ld %ld %3.1f %6.4f %3.1f %6.4f
ndcg_rel: %ld %ld %6.4f %6.4f %6.4f
num_nonrel_judged_ret    Number of non-relevant judged documents retrieved for topic. 
    Not an evaluation number per se, but gives details of retrieval results.
    Summary figure is sum of individual topics, not average.
num_q    Number of topics results averaged over.  May be different from
    number of topics in the results file if -c was used on the command line,
    in which case the number of topics in the rel_info file is used.
num_rel    Number of relevant documents for topic. 
    May be affected by Judged_docs_only and Max_retrieved_per_topic command
    line parameters (as are most measures).
    Summary figure is sum of individual topics, not average.
trec_eval: m_num_rel: rel_info format not qrels or qrels_jg
num_rel_ret    Number of relevant documents retrieved for topic. 
    May be affected by Judged_docs_only and Max_retrieved_per_topic command
    line parameters (as are most measures).
    Summary figure is sum of individual topics, not average.
num_ret    Number of documents retrieved for topic. 
    May be affected by Judged_docs_only and Max_retrieved_per_topic command
    line parameters (as are most measures).
    Summary figure is sum of individual topics, not average.
P    Precision at cutoffs
    Precision measured at various doc level cutoffs in the ranking.
    If the cutoff is larger than the number of docs retrieved, then
    it is assumed nonrelevant docs fill in the rest.  Eg, if a method
    retrieves 15 docs of which 4 are relevant, then P20 is 0.2 (4/20).
    Precision is a very nice user oriented measure, and a good comparison
    number for a single topic, but it does not average well. For example,
    P20 has very different expected characteristics if there are 300
    total relevant docs for a topic as opposed to 10.
    Note:   trec_eval -m P.50 ...
    is different from 
            trec_eval -M 50 -m set_P ...
    in that the latter will not fill in with nonrel docs if less than 50
    docs retrieved
    Cutoffs must be positive without duplicates
    Default param: -m P.5,10,15,20,30,100,200,500,1000
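    A minimal sketch (Python) of the cutoff computation described above:
      # rels: relevance (0/1) of retrieved docs in rank order; docs beyond the
      # end of the list count as nonrelevant (the fill-in rule above).
      def precision_at(rels, k):
          return sum(rels[:k]) / k

      print(precision_at([1, 0, 1, 1, 0], 20))   # 3 relevant in top 20 -> 0.15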
P_avgjg    Precision at cutoffs, averaged over judgment groups (users)
    Precision measured at various doc level cutoffs in the ranking.
    If the cutoff is larger than the number of docs retrieved, then
    it is assumed nonrelevant docs fill in the rest.  Eg, if a method
    retrieves 15 docs of which 4 are relevant, then P20 is 0.2 (4/20).
    If there are multiple relevance judgment sets for this query, Precision
    is averaged over the judgment groups.
    Cutoffs must be positive without duplicates
    Default param: trec_eval -m P.5,10,15,20,30,100,200,500,1000
prefs_avgjg    Simple ratio of preferences fulfilled to preferences possible
    within a judgment group, averaged over jgs.  I.e., rather than considering
    all preferences equal (prefs_simp), consider all judgment groups equal.
    prefs_avgjg = AVERAGE_OVER_JG (fulfilled_jg / possible_jg);
    May be useful in applications where user satisfaction is represented
    by a jg per user, and it is not desirable for many preferences expressed
    by user1 to swamp a few preferences by user2.
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), and counts as failure if neither A nor B retrieved.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_avgjg_imp    Simple ratio of preferences fulfilled to preferences possible
    within a judgment group, averaged over jgs.  I.e., rather than considering
    all preferences equal (prefs_simp), consider all judgment groups equal.
    prefs_avgjg = AVERAGE_OVER_JG (fulfilled_jg / possible_jg);
    May be useful in applications where user satisfaction is represented
    by a jg per user, and it is not desirable for many preferences expressed
    by user1 to swamp a few preferences by user2.
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), but ignores pair if neither A nor B retrieved.
    pref_*_imp measures don't have any preferred applications that I know of,
    but some people like them.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_avgjg_ret    Simple ratio of preferences fulfilled to preferences possible
    within a judgment group, averaged over jgs.  I.e., rather than considering
    all preferences equal (prefs_simp), consider all judgment groups equal.
    prefs_avgjg = AVERAGE_OVER_JG (fulfilled_jg / possible_jg);
    May be useful in applications where user satisfaction is represented
    by a jg per user, and it is not desirable for many preferences expressed
    by user1 to swamp a few preferences by user2.
    For doc pref A>B, A and B must both be retrieved to be counted as either
    fulfilled or possible.
    pref_*_ret measures should be used for dynamic collections but are
    inferior in most other applications.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_avgjg_Rnonrel    Ratio of preferences fulfilled to preferences possible within a
    judgment group, averaged over jgs, except that the number of
    nonrelevant retrieved docs (rel_level == 0.0) in each jg is set to
    R, the number of relevant retrieved docs (rel_level > 0.0) in that jg.
    
    This addresses the general problem that the number of
    nonrelevant docs judged for a topic can be critical to fair
    evaluation - adding a couple of hundred preferences involving
    nonrelevant docs (out of the possibly millions in a collection) can
    both change the importance of the topic when averaging and even
    change whether system A scores better than system B (even given
    identical retrieval on the added nonrel docs).
    
    This measure conceptually sets the number of nonrelevant retrieved
    docs of a jg to R. If the actual number, N, is less than R, then R
    * (R-N) fulfilled preferences are added.  If N is greater than R,
    then only the first R (rank order) docs in the single ec with
    rel_level = 0.0 are used and the number of preferences are
    recalculated.  
    If there is a single jg with two equivalence classes (one of them 0.0), 
    then prefs_avgjg_Rnonrel is akin to the ranked measure bpref.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_avgjg_Rnonrel_ret    Ratio of preferences fulfilled to preferences possible within a
    judgment group, averaged over jgs, except that the number of
    nonrelevant retrieved docs (rel_level == 0.0) in each jg is set to
    R, the number of relevant retrieved docs (rel_level > 0.0) in that jg.
    
    This addresses the general problem that the number of
    nonrelevant docs judged for a topic can be critical to fair
    evaluation - adding a couple of hundred preferences involving
    nonrelevant docs (out of the possibly millions in a collection) can
    both change the importance of the topic when averaging and even
    change whether system A scores better than system B (even given
    identical retrieval on the added nonrel docs).
    
    This measure conceptually sets the number of nonrelevant retrieved
    docs of a jg to R. If the actual number, N, is less than R, then R
    * (R-N) fulfilled preferences are added.  If N is greater than R,
    then only the first R (rank order) docs in the single ec with
    rel_level = 0.0 are used and the number of preferences are
    recalculated.  
    If there is a single jg with two equivalence classes (one of them 0.0), 
    then prefs_avgjg_Rnonrel is akin to the ranked measure bpref.
    For doc pref A>B, A and B must both be retrieved to be counted as either
    fulfilled or possible.
    pref_*_ret measures should be used for dynamic collections but are
    inferior in most other applications.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_num_prefs_ful    Number of prefs fulfilled
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), and counts as failure if neither A nor B retrieved.
    Summary figure is sum of individual topics, not average.
prefs_num_prefs_ful_ret    Number of prefs fulfilled among retrieved docs
    For doc pref A>B, both A and B must be retrieved to be counted.
    Summary figure is sum of individual topics, not average.
prefs_num_prefs_poss    Number of possible prefs independent of whether documents retrieved
    Summary figure is sum of individual topics, not average.
prefs_pair   Average over doc pairs of preference ratio for that pair.
    If a doc pair satisfies 3 preferences but fails 2 preferences (preferences
    from 5 different users),  then the score for doc pair is 3/5.
    Same as prefs_simp if there are no doc_pairs in multiple judgment groups.
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), and counts as failure if neither A nor B retrieved.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_pair_imp   Average over doc pairs of preference ratio for that pair.
    If a doc pair satisfies 3 preferences but fails 2 preferences (preferences
    from 5 different users),  then the score for doc pair is 3/5.
    Same as prefs_simp if there are no doc_pairs in multiple judgment groups.
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), but ignores pair if neither A nor B retrieved.
    pref_*_imp measures don't have any preferred applications that I know of,
    but some people like them.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_pair_ret   Average over doc pairs of preference ratio for that pair.
    If a doc pair satisfies 3 preferences but fails 2 preferences (preferences
    from 5 different users),  then the score for doc pair is 3/5.
    Same as prefs_simp if there are no doc_pairs in multiple judgment groups.
    For doc pref A>B, A and B must both be retrieved to be counted as either
    fulfilled or possible.
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), and counts as failure if neither A nor B retrieved.
    pref_*_ret measures should be used for dynamic collections but are
    inferior in most other applications.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_simp    Simple ratio of preferences fulfilled to preferences possible.
    If a doc pair satisfies two preferences, both are counted.
    If preferences are conflicted for a doc pair, all are counted
    (and thus max possible score may be less than 1.0 for topic).
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), and counts as failure if neither A nor B retrieved.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_simp_imp    Simple ratio of preferences fulfilled to preferences possible.
    If a doc pair satisfies two preferences, both are counted.
    If preferences are conflicted for a doc pair, all are counted
    (and thus max possible score may be less than 1.0 for topic).
    For doc pref A>B, this includes implied preferences (only one of A or B
    retrieved), but ignores pair if neither A nor B retrieved.
    pref_*_imp measures don't have any preferred applications that I know of,
    but some people like them.
    Assumes '-R prefs' or '-R qrels_prefs'
prefs_simp_ret    Simple ratio of preferences fulfilled to preferences possible among
    the retrieved docs.
    If a doc pair satisfies two preferences, both are counted.
    If preferences are conflicted for a doc pair, all are counted
    (and thus max possible score may be less than 1.0 for topic).
    For doc pref A>B, A and B must both be retrieved to be counted as either
    fulfilled or possible.
    pref_*_ret measures should be used for dynamic collections but are
    inferior in most other applications.
    Assumes '-R prefs' or '-R qrels_prefs'
recall    Recall at cutoffs
    Recall (relevant retrieved / relevant) measured at various doc level
    cutoffs in the ranking. If the cutoff is larger than the number of docs
    retrieved, then it is assumed nonrelevant docs fill in the rest.
    Recall is a fine single-topic measure, but does not average well.
    Cutoffs must be positive without duplicates
    Default param: -m recall.5,10,15,20,30,100,200,500,1000
recip_rank    Reciprocal Rank of the first relevant retrieved doc.
    Measure is most useful for tasks in which there is only one relevant
    doc, or the user only wants one relevant doc.
relative_P    Relative Precision at cutoffs
    Precision at cutoff relative to the maximum possible precision at that
    cutoff.  Equivalent to Precision up until R, and then recall after R
    Cutoffs must be positive without duplicates
    Default params: -m relative_P.5,10,15,20,30,100,200,500,1000
relstring    The relevance values for the first N (default 10) retrieved docs
    are printed as a string, one character per relevance value for a doc.
    If the relevance value is between 0 and 9, it is printed.
    If the value is > 9,  '>' is printed.
    If the document was not in the pool to be judged, '-' is printed.
    if the document was in the pool, but unjudged (eg, infAP),  '.' is printed
    if the document has some other relevance value, '<' is printed.
    Measure is only printed for individual queries.
    Default usage:  -m relstring.10 
%-22s	%s	'%s'
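    An illustrative sketch (Python, not trec_eval's code) of the character
    mapping; here None stands for 'not in the pool' and -1 for 'in the pool
    but unjudged' (an encoding assumed for this sketch only):
      def relstring(rels, n=10):
          def ch(r):
              if r is None:        return '-'
              if r == -1:          return '.'
              if 0 <= r <= 9:      return str(r)
              if r > 9:            return '>'
              return '<'
          return ''.join(ch(r) for r in rels[:n])

      print(relstring([2, 0, None, -1, 1, 12]))   # prints 20-.1>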
Rndcg    Normalized Discounted Cumulative Gain at R levels
    Experimental measure
    Compute a traditional nDCG measure according to Jarvelin and
    Kekalainen (ACM ToIS v. 20, pp. 422-446, 2002), averaged at the various
    R level points. The R levels are the number of docs at each non-negative
    gain level in the judgments, with the gain levels sorted in decreasing
    order. Thus if there are 5 docs with gain_level 3, 3 with gain 2, 10
    with gain 1, and 50 with gain 0, then 
    Rndcg = 1/4 (ndcg_at_5 + ndcg_at_8 + ndcg_at_18 + ndcg_at_68).
    In this formulation, all unjudged docs have gain 0.0, and thus there is
    a final implied R-level change at num_retrieved.
    Idea behind Rndcg is that the expected value of ndcg is a smoothly
    decreasing function, with discontinuities upward at each transition
    between positive gain levels in the ideal ndcg.  Once the gain level 
    becomes 0, the expected value of ndcg then increases until all docs are
    retrieved. Thus averaging ndcg is problematic, because these transitions
    occur at different points for each topic.  Since it is not unusual for
    ndcg to start off near 1.0, decrease to 0.25, and then increase to 0.75
    at various cutoffs, the points at which ndcg is measured are important.
    
    Gain values are set to the appropriate relevance level by default.  
    The default gain can be overridden on the command line by having 
    comma separated parameters 'rel_level=gain'.
    Eg, 'trec_eval -m Rndcg.1=3.5,2=9.0,4=7.0 ...'
    will give gains 3.5, 9.0, 3.0, 7.0 for relevance levels 1,2,3,4
    respectively (level 3 remains at the default).
    Gains are allowed to be 0 or negative, and relevance level 0
    can be given a gain.
Rndcg: %ld %ld %3.1f %6.4f %3.1f %6.4f %6.4f
Rndcg: %ld %ld %3.1f %6.4f %3.1f %6.4f
Rprec    Precision after R documents have been retrieved.
    R is the total number of relevant docs for the topic.  
    This is a good single point measure for an entire retrieval
    ranking that averages well since each topic is being averaged
    at an equivalent point in its result ranking.
    Note that this is the point that Precision = Recall.
    History: Originally developed for IR rankings by Chris Buckley
    after TREC 1, but analogs were used in other disciplines previously.
    (the point where P = R is an important one!)
    Cite: 'Retrieval System Evaluation', Chris Buckley and Ellen Voorhees.
    Chapter 3 in TREC: Experiment and Evaluation in Information Retrieval
    edited by Ellen Voorhees and Donna Harman.  MIT Press 2005
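    A minimal sketch (Python, not part of trec_eval itself), assuming
    'ranked_rels' is the ranked list of relevance values and 'num_rel' is R:

        def rprec(ranked_rels, num_rel, rel_level=1):
            # Precision after R docs; positions beyond the retrieved list
            # count as nonrelevant.
            if num_rel == 0:
                return 0.0
            rel_ret = sum(1 for r in ranked_rels[:num_rel] if r >= rel_level)
            return rel_ret / num_rel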
Rprec_mult    Precision measured at multiples of R (num_rel).
    This is an attempt to measure topics at the same multiple milestones
    in a retrieval (see explanation of R-prec), in order to determine
    whether methods are precision oriented or recall oriented.  If method A
    dominates method B at the low multiples but performs less well at the
    high multiples then it is precision oriented (compared to B).
    Default param: -m Rprec_mult.0.2,0.4,0.6,0.8,1.0,1.2,1.4,1.6,1.8,2.0 ...
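    A minimal sketch for a single multiple (Python, not part of trec_eval
    itself); rounding the cutoff rank to the nearest integer is an assumption
    here:

        def rprec_mult(ranked_rels, num_rel, mult, rel_level=1):
            # Precision at rank mult * R, with the rank rounded to an integer.
            cutoff = int(mult * num_rel + 0.5)
            if cutoff == 0:
                return 0.0
            rel_ret = sum(1 for r in ranked_rels[:cutoff] if r >= rel_level)
            return rel_ret / cutoff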
Rprec_mult_avgjg    Precision measured at multiples of R (num_rel), averaged over users.
    This is an attempt to measure topics at the same multiple milestones
    in a retrieval (see explanation of R-prec), in order to determine
    whether methods are precision oriented or recall oriented.  If method A
    dominates method B at the low multiples but performs less well at the
    high multiples then it is precision oriented (compared to B).
    If there is more than one judgment group (set of evaluation judgments
    of a user), then the measure is averaged over those jgs.
    Default param: 
    trec_eval -m Rprec_mult_avgjg.0.2,0.4,0.6,0.8,1.0,1.2,1.4,1.6,1.8,2.0  ...
runid    Runid given by results input file.
set_F      Set F measure: weighted harmonic mean of recall and precision
    set_Fx = (x+1) * P * R / (R + x*P)
    where x is the relative importance of R to P (default 1.0).
    Default usage: trec_eval -m set_F.1.0 ...
    Cite: Variant of van Rijsbergen's E measure ('Information Retrieval',
    Butterworths, 1979).
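    A minimal sketch of the formula above (Python, not part of trec_eval
    itself), taking set precision P and set recall R as inputs:

        def set_F(P, R, x=1.0):
            # Weighted harmonic mean; x is the relative importance of R to P.
            if P == 0.0 and R == 0.0:
                return 0.0
            return (x + 1.0) * P * R / (R + x * P)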
set_map    Set map: num_relevant_retrieved**2 / (num_retrieved*num_rel)
    Unranked set map, where the precision due to all relevant retrieved docs
    is the set precision, and the precision due to all relevant not-retrieved
    docs is set to 0.
    Was known as exact_unranked_avg_prec in earlier versions of trec_eval.
    Another way of looking at this is Recall * Precision on the set of
    docs retrieved for a topic.
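    A minimal sketch of the formula above (Python, not part of trec_eval itself):

        def set_map(num_rel_ret, num_ret, num_rel):
            # num_relevant_retrieved**2 / (num_retrieved * num_rel),
            # i.e. set recall times set precision.
            if num_ret == 0 or num_rel == 0:
                return 0.0
            return num_rel_ret ** 2 / (num_ret * num_rel)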
set_P    Set Precision: num_relevant_retrieved / num_retrieved 
    Precision over all docs retrieved for a topic.
    Was known as exact_prec in earlier versions of trec_eval
    Note:   trec_eval -m P.50 ...
    is different from 
            trec_eval -M 50 -m set_P ...
    in that the latter will not fill in with nonrel docs if fewer than 
    50 docs are retrieved.
set_recall    Set Recall: num_relevant_retrieved / num_relevant 
    Recall over all docs retrieved for a topic.
    Was known as exact_recall in earlier versions of trec_eval
set_relative_P    Relative Set Precision:  P / (Max possible P for this size set)
    Relative precision over all docs retrieved for a topic.
    Was known as exact_relative_prec in earlier versions of trec_eval
    Note:   trec_eval -m relative_P.50 ...
    is different from 
            trec_eval -M 50 -m set_relative_P ...
success    Success at cutoffs
    Success (a relevant doc has been retrieved) measured at various doc level
    cutoffs in the ranking.
    If the cutoff is larger than the number of docs retrieved, then
    it is assumed nonrelevant docs fill in the rest.
    Cutoffs must be positive without duplicates
    Default param: trec_eval -m success.1,5,10
    History: Developed by Stephen Tomlinson.
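    A minimal sketch for a single cutoff (Python, not part of trec_eval
    itself), assuming 'rels' holds the ranked relevance values:

        def success_at(rels, cutoff, rel_level=1):
            # 1.0 if any relevant doc appears in the top 'cutoff', else 0.0;
            # missing positions are implicitly nonrelevant.
            return 1.0 if any(r >= rel_level for r in rels[:cutoff]) else 0.0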
utility    Set utility measure
    Set evaluation based on contingency table:
                        relevant  nonrelevant
       retrieved            a          b
       nonretrieved         c          d
    where  utility = p1 * a + p2 * b + p3 * c + p4 * d
    and p1-4 are parameters (given on command line in that order).
    Conceptually, each retrieved relevant doc is worth something positive to
    a user, each retrieved nonrelevant doc has a negative worth, each 
    relevant doc not retrieved may have a negative worth, and each
    nonrelevant doc not retrieved may have a (small) positive worth.
    The overall measure is simply a weighted sum of these values.
    If p4 is non-zero, then '-N num_docs_in_coll' may also be needed - the
    standard results and rel_info files do not contain that information.
    Default usage: -m utility.1.0,-1.0,0.0,0.0 ...
    Warning: Current version summary evaluation averages over all topics;
    it could be argued that simply summing is more useful (but not backward
    compatible).
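    A minimal sketch of the weighted sum above (Python, not part of trec_eval
    itself), with a, b, c, d taken from the contingency table:

        def utility(a, b, c, d, p1=1.0, p2=-1.0, p3=0.0, p4=0.0):
            # Default coefficients match the default usage shown above.
            return p1 * a + p2 * b + p3 * c + p4 * d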
trec_eval.calc_utility: improper number of coefficients
yaap    Yet Another Average Precision
    Adaptation of MAP proposed by Stephen Robertson to get a value
    that is more globally averagable than MAP.  Should be monotonic with
    MAP on a single topic, but handles extreme values better.
    log ((1 + sum_probrel)  /  (1 + num_rel - sum_probrel))
    where sum_probrel = sum over all rels of (numrel_before_it / current rank)
    Cite: 'On Smoothing Average Precision', Stephen Robertson.
    ECIR 2012, LNCS 7224, pp. 158-169, 2012.
    Edited by R. Baeza-Yates et al.  Springer-Verlag, Berlin.
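    A minimal sketch of the formula above (Python, not part of trec_eval
    itself), assuming 'sum_probrel' and 'num_rel' are already computed as
    described:

        import math

        def yaap(sum_probrel, num_rel):
            # log ((1 + sum_probrel) / (1 + num_rel - sum_probrel))
            return math.log((1.0 + sum_probrel) / (1.0 + num_rel - sum_probrel))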
trec_eval: Negative cutoff detected
trec_eval: duplicate cutoffs detected
trec_eval: malformed pair parameters in '%s'
trec_eval [-h] [-q] [-m measure[.params]] [-c] [-n] [-l <num>]
   [-D debug_level] [-N <num>] [-M <num>] [-R rel_format] [-T results_format]
   rel_info_file  results_file 
 
Calculate and print various evaluation measures, evaluating the results  
in results_file against the relevance info in rel_info_file. 
 
There are a fair number of options, of which only the lower case options are 
normally ever used.   
 --help:
 -h: Print full help message and exit. Full help message will include
     descriptions for any measures designated by a '-m' parameter, and
     input file format descriptions for any rel_info_format given by '-R'
     and any top results_format given by '-T'.
     Thus to see all info about preference measures use
          trec_eval -h -m all_prefs -R prefs -T trec_results 
 --version:
 -v: Print version of trec_eval and exit.
 --query_eval_wanted:
 -q: In addition to summary evaluation, give evaluation for each query/topic
 --measure measure_name[.measure_params]:
 -m measure: Add 'measure' to the lists of measures to calculate and print.
    If 'measure' contains a '.', then the name of the measure is everything
    preceding the period, and everything to the right of the period is
    assumed to be a list of parameters for the measure, separated by ','. 
    There can be multiple occurrences of the -m flag.
    'measure' can also be a nickname for a set of measures. Current 
    nicknames include 
       'official': the main measures often used by TREC
       'all_trec': all measures calculated with the standard TREC
                   results and rel_info format files.
       'set': subset of all_trec that calculates unranked values.
       'prefs': Measures not in all_trec that calculate preference measures.
 --complete_rel_info_wanted:
 -c: Average over the complete set of queries in the relevance judgements  
     instead of the queries in the intersection of relevance judgements 
     and results.  Missing queries will contribute a value of 0 to all 
     evaluation measures (which may or may not be reasonable for a  
     particular evaluation measure, but is reasonable for standard TREC 
     measures.) Default is off.
 --level_for_rel num:
 -l: Num indicates the minimum relevance judgement value needed for 
      a document to be called relevant. Used if rel_info_file contains 
      relevance judged on a multi-relevance scale.  Default is 1. 
 --nosummary:
 -n: No summary evaluation will be printed
 --Debug_level num:
 -D : Debug level.  1 and 2 used for measures, 3 and 4 for merging
      rel_info and results, 5 and 6 for input.  Currently, num may also
      include a qid, in which case only that qid will be evaluated with
      debug info printed.
     Default is 0.
 --Number_docs_in_coll num:
 -N : Number of docs in collection.  Default is MAX_LONG.
 --Max_retrieved_per_topic num:
 -M : Max number of docs per topic to use in evaluation (discard rest). 
      Default is MAX_LONG.
 --Judged_docs_only:
 -J: Calculate all values only over the judged (either relevant or  
     nonrelevant) documents.  All unjudged documents are removed from the 
     retrieved set before any calculations (possibly leaving an empty set). 
     DO NOT USE, unless you really know what you're doing - very easy to get 
     reasonable looking numbers in a file that you will later forget were 
     calculated  with the -J flag.  
 --Rel_info_format format:
 -R format: The rel_info file is assumed to be in format 'format'.  Current
    values for 'format' include 'qrels', 'prefs', 'qrels_prefs'.  Note not
    all measures can be calculated with all formats.
 --Results_format format:
 -T format: the top results_file is assumed to be in format 'format'. Current
    values for 'format' include 'trec_results'. Note not all measures can be
    calculated with all formats.
 --Zscore Zmean_file:
 -Z Zmean_file: Instead of printing the raw score for each measure, print
    a Z score instead. The score printed will be the deviation from the mean
    of the raw score, expressed in standard deviations, where the mean and
    standard deviation for each measure and query are found in Zmean_file.
     If the mean is not in Zmean_file for a measure and query, -1000000 is printed.
    Zmean_file format is ascii lines of form 
       qid  measure_name  mean  std_dev
 
 
Standard evaluation procedure:
For each of the standard TREC measures requested, a ranked list of
relevance judgements is created corresponding to each ranked retrieved doc.
A rel judgement is set to -1 if the document was not in the pool (not in 
rel_info_file) or -2 if the document was in the pool but unjudged (some 
measures (infAP) allow the pool to be sampled instead of judged fully).  
Otherwise it is set to the value in rel_info_file. 
Most measures, but not all, will treat -1 or -2 the same as 0, 
namely nonrelevant.  Note that relevance_level is used to 
determine if the document is relevant during score calculations. 
Queries for which there is no relevance information are ignored. 
Warning: queries for which there are relevant docs but no retrieved docs 
are also ignored by default.  This allows systems to evaluate over subsets  
of the relevant docs, but means if a system improperly retrieves no docs,  
it will not be detected.  Use the -c flag to avoid this behavior. 
Usage: trec_eval [-h] [-q] {-m measure}* trec_rel_file trec_top_file
   -h: Give full help information, including other options
   -q: In addition to summary evaluation, give evaluation for each query
   -m: calculate and print measures indicated by 'measure'
       ('-m all_qrels' prints all qrels measures, '-m official' is default)
trec_eval: illegal measure '%s'
trec_eval: Quit in file '%s'
trec_eval: Illegal rel_format '%s'
trec_eval: Illegal retrieval results format '%s'
officialtrec_eval: illegal measure 'official'
alltrec_eval: Cannot initialize measure '%s'
trec_eval: Can't calculate measure '%s'
trec_eval: Can't accumulate measure '%s'
trec_eval: Can't print query measure '%s'
trec_eval: No queries with both results and relevance info
trec_eval: Can't print measure '%s'
trec_eval: cleanup failed
trec_eval: improper measure in parameter '%s'
-----------------------
Individual measure documentation for requested measures%s
%s-- No measures indicated.
   Request measure documentation using <-m measure> on command line
??zR|?,8???A?B
F????
?A?A?A?JL???/A?B
k?zR|????vA?B
r?<f????A?B
??\?????A?B
??zR|?(T???e
A?B
E??Z
?A?A?$H????6A?B
D?-?A?(p?????A?B
E????A?A?$?2????A?B
D???A?$????2A?B
D?)?A??????A?B
??????A?B
?$,????<A?B
D?3?A?T???oA?B
k?t??tA?B
p??c?? A?B
\??c???A?B
??????RA?B
N????A?B
Z?	??A?B
Q?$4????A?B
E????A?A?\????A?B
??|????A?B
??????A?B
??????A?B
??????-A?B
)???
???A?B
~?
???A?B
??zR|?$????A?B
D???A?DM??uA?B
q?d???RA?B
N?????A?B
Z??????A?B
??zR|?$4??P	A?B
D?G	?A?D\??uA?B
q?d???RA?B
N?????A?B
Z??????A?B
??zR|?(l???A?B
E????A?A?H?#??DA?B
@?h"$??A?B
???'???A?B
??zR|?$X'??_A?B
D?V?A?D?+??DA?B
@?d?+??nA?B
j??.???A?B
??zR|?$X.???A?B
D???A?D?3??jA?B
f?dI4??~A?B
z???6???A?B
??zR|?$7??[A?B
D?R?A?DO;??DA?B
@?ds;??~A?B
z???=???A?B
??zR|?$(>??AA?B
D?8?A?DAC??DA?B
@?deC???A?B
????F???A?B
??zR|?$?F??&A?B
D??A?D?J??DA?B
@?d?J??~A?B
z??M??mA?B
i?zR|?LM??=A?B
9?zR|?TO???A?B
??zR|?P???A?B
~?zR|?XQ???A?B
??$<?T??A?B
D?	?A?d?V??/A?B
k???V??HA?B
D?zR|?$?V???A?B
D???A?zR|? ?X???A?B
D???A?zR|?A?B
:?$<p??A?B
D?	?A?dr??/A?B
k??r??HA?B
D?zR|?$r??wA?B
s?zR|?dr?? A?B
\?<dr???A?B
{?zR|?lr??VA?B
R?<?r???A?B
??zR|?t??VA?B
R?zR|?4t??VA?B
R?zR|?Tt??A?B
?zR|?$8u???A?B
D???A?zR|??v??$A?B
 ?zR|??w??
A?B
?zR|??x???A?B
??zR|?$@y??A?B
D??A?$D5{??2A?B
D?)?A?zR|?$(????A?B
D???A?$D?????A?B
D???A?zR|?P????A?B
??zR|??????A?B
??zR|?????A?B
??zR|??????A?B
??zR|?????A?B
?zR|?Ȋ??FA?B
B?zR|?؋??	A?B
?zR|??????A?B
??zR|?\????A?B
??zR|?؍??$A?B
 ?zR|?Ď???A?B
??zR|? ???VA?B
R?zR|?@???:A?B
6? <Z???HA?B
D??A?`~???uA?B
q?zR|?????:A?B
6?$<֕??A?B
D?	?A?d????/A?B
k??ϗ??HA?B
D?zR|??????A?B
??zR|?X???A?B

?zR|?$4???A?B
D??A?zR|????A?B
Q?<	???>A?B
z?zR|?????A?B
??zR|??????A?B
??zR|?؝??hA?B
d?zR|????hA?B
d?zR|?8????A?B
??zR|?????A?B
?zR|?\???
A?B
	?zR|? 4????A?B
D???A?zR|?Ġ??
A?B
F?<????HA?B
D? \֠??zA?B
D?q?A?zR|????
A?B
F?<????[A?B
W? \9????A?B
D?|?A? ??????A?B
D???A?zR|?????A?B
P?<????A?B
??\[????A?B
??$|Ƣ??wA?B
D?n?A?$????wA?B
D?n?A?$?d???WA?B
D?N?A?$?????WA?B
D?N?A?$§??A?B
D???A?$D?????A?B
D???A?$lR???>A?B
D?5?A?$?h????A?B
D???A?????A?B
O?????:A?B
v?????_A?B
[?[???kA?B
g? <????~A?B
D?u?A?zR|????
A?B
F?<Ү??uA?B
q? \'????A?B
D???A? ?Я???A?B
D???A???????A?B
??zR|?$???
A?B
F?<????A?B
?? \?????A?B
D???A? ?*????A?B
D???A?zR|?,̲???A?B
L???v?A?A?A? L%????A?B
D???A?p????gA?B
c???????A?B
???????lA?B
h?????[A?B
W??;???
A?B
	?zR|????[A?B
W?<K???yA?B
u?zR|????? C TC zR|?????zR|?t???zR|?X???zR|?realloc?snprintfstrcmpstrlenstrncpydFreeLibraryGetModuleHandleAEGetProcAddress)LoadLibraryAppppppppppppppppppppppppppppppppppppppppppppppppcygwin1.dllppppKERNEL32.dll?0?HX??

  
    
      
        
      
    
  
  
    
      
      
      
      
      
      
      
       
      
       
    
  

@???@???@p,~.?@@e	?7dJ@?
?A?H\U@??P?]@??W?e@d	.`?n@??g?v@	p?@Uw?@=??@?@?v?0?@?????@8.??@?V?̓@?)???@?K?D?@???D?@????@G??H?@I8???@????@?(?,?@?~??@?
԰@w?L?@_'??@?%??@V?.?@V,7\?@Y@x?@??Ih?@$?T??@
?_??@??jh?@Ow??@?z?H?@?S???@?8?p?@??0?@????@ǹ??@F?? ?@	s?,?@?L??@?%???@$f???@?"???@Vc??@??
??@?O??@?*D?@?(X?@?2x?@S?:??@?D|?@??L?@hXUl?@h^??@??fX?@ph?@
yx?@??D?@?F??@?N???@H????@?????@F|???$APs?tA???`A F??A}??A???A???A"??A,??A)? AD0AQ`A4?#?GNU C 4.9.3 20150626 (Fedora Cygwin 4.9.3-1) -mtune=generic -march=i686 -g -ggdb -O2 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/build=/usr/src/debug/cygwin-2.4.1-1 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/src/newlib-cygwin=/usr/src/debug/cygwin-2.4.1-1 -fno-common -fbuiltin -fmessage-length=0 -fmerge-constants -ftracer/usr/src/debug/cygwin-2.4.1-1/winsup/cygwin/crt0.c@?intsize_t??unsigned intshort unsigned intcharlong intlong long int?sizetypelong unsigned intunsigned charDWORD|Kfloatsigned charshort int`long long unsigned intULONG_PTR7KDWORD_PTR??__int32_t??__uint32_tA?_LOCK_Tq_off64_t"_fpos_t'_fpos64_t-_ssize_t7?wint_ta?J?__wchLY__wchbM?	`?
?G?__countI?__valueNh_mbstate_tO?_flock_tS
__ULongK
_Bigint-V_next/V_k0?_maxwds0?_sign0?_wds0?_x1\?	?l
?
__tm$5__tm_sec7?__tm_min8?__tm_hour9?__tm_mday:?__tm_mon;?__tm_year?__tm_isdst?? Ho_fnargsIo_dso_handleJo?_fntypesL?_is_cxaO?	q
?_atexit?[?_next\?_ind]?_fns_?`?	??
??
__sbufs_baset?_sizeu?I,,q3?2_reent@8?_errno:?_stdin?\
_stdout?\
_stderr?\
_incA?_emergencyB
_current_categoryD?0_current_localeE?4__sdidinitG?8__cleanupI
<_resultLV@_result_kM?D_p5sNVH_freelistO#
L_cvtlenR?P_cvtbufS3T_newv?X_atexitz?H_atexit0{L_sig_func4
?__sglue?
?__sf?@
?I?,q????)%,q)???,q+	`U
?	`e
?
__sFILE64p??	_p?_r??_w??_flags?_file?_bf??_lbfsize??_data?,_cookie?q _read??$_write?(_seek%,_close?0_ub?4_up?<_ur?@_ubuf	ED_nbuf
UG_lb
?H_blksize?P_flags2?T_offsetX_seek64
`_lock?d_mbstate?h8
,q8??	__FILEe_glue!V
_next#V
_niobs$?_iobs%\

	
_rand48=?
_seed>?
_mult??
_add@?	??

??W]_unused_randY?_strtok_lastZ3_asctime_buf[]_localtime_buf\l$_gamma_signgam]?H_rand_next^?P_r48_b
X_mblen_state`?h_mbtowc_statea?p_wctomb_stateb?x_l64a_bufcm?_signal_bufd}?_getdate_erre??_mbrlen_statef??_mbrtowc_stateg??_mbsrtowcs_stateh??_wcrtomb_statei??_wcsrtombs_statej??_h_errnok??	m
?	}
?	?
??p?_nextfs?_nmalloct?x	??
?	??
??U
_reentl?
_unusedu?	

?
,
V4
?:
)
		
P

?
HINSTANCE__?t
unused??HINSTANCE??
P
HMODULE?t
qdoubleint32_t	?uint32_t	 ?per_process?
-Ainitial_sp
/3magic_biscuit
3?
dll_major
4?
dll_minor
5?
impure_ptr_ptr
7Aenvptr
9Gmalloc
>bfree
?srealloc
@? fmode_ptr
B9$main
D?(ctors
E?,dtors
F?0data_start
Iq4data_end
Jq8bss_start
Kq<bss_end
Lq@calloc
N?Dpremain
P?Hrun_ctors_p
S?
Xunused
U\cxx_malloc
X+xhmodule
Z?
|api_major
\s?api_minor
]s?unused2
c1?posix_memalign
fZ?pseudo_reloc_start
hq?pseudo_reloc_end
iq?image_base
jq?threadinterface
oq?impure_ptr
q,?,M3qb?Ssqhq?q?y???MM??q????	??
????M??
	?
?per_process_cxx_malloc	?A
??Z?
??AmainCRTStartup@?? @?!t 1@B?"t0"t0"t0 M@j?"t0"t0"t0 i@??"t0"t0"t0#?@?"t0"t0"t0$cygwin_crt0((.?B?M%cygwin_premain0
}j?M?%cygwin_premain1
~??M?%cygwin_premain2
??M?&cygwin_premain3
??M???GNU C11 5.3.0 -mtune=generic -march=i686 -gconvert_zscores.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?@?]signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int#d#doublefloat8?name9Xvalue/?lj>/?hdebug_print_counts_arrayM?;@?(caM??iN/?ljN/?hdebug_print_jg^?<@-?ejg^??i_/?ldebug_print_results_prefs??>@???rp?
?i?/?lte_form_pref_counts_cleanup??E?@??
current_query?b A
max_current_query?/`B??/`B=?/ `B
num_jgs?/$`B
jgs??(`B
max_num_jgs?/,`B
rank_pool?=0`B
max_rank_pool?/4`B
ec_pool?Z8`B
max_ec_pool?/<`B
ca_pool??@`B
max_ca_pool?/D`B
ca_ptr_pool?H`B
max_ca_ptr_pool?/L`B
pa_pool?\P`B
max_pa_pool?/T`B
pa_ptr_pool??X`B
max_pa_ptr_pool?/\`B
rel_pool?```B
max_rel_pool?/d`B??5h`B
max_prefs_and_ranks?/l`B
docno_results??p`B
max_docno_results?/t`B
temp_pa_pool?\x`B
max_temp_pa_pool?/|`B
temp_pa_ptr_pool???`B
max_temp_pa_ptr_pool?/?`B
saved_num_judged?/?`BE	BGNU C11 5.3.0 -mtune=generic -march=i686 -gform_res_rels.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?@@e	?
signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int!doublefloat?measure_nameXparametersXMEAS_ARGq8query_flag+summary_flag+debug_level+debug_queryXrelation_flag+average_complete_flag +judged_docs_only_flag$+num_docs_in_coll'+relevance_level(+ max_num_docs_per_topic++$rel_info_format,X(results_format-X,zscore_flag.+0meas_arg34?EPI4?`[qidaXrun_idbXret_formatcXq_resultse)RESULTSfn?qidoXrel_formatpXq_rel_infoq)REL_INFOrj????[
?docnoXsimhTEXT_RESULTS?anum_text_results+max_num_text_results+text_resultsa?TEXT_RESULTS_INFO
?docnoXrel+TEXT_QRELS?num_text_qrels+max_num_text_qrels +text_qrels"?TEXT_QRELS_INFO$? E?num_rel_retG+num_retJ+num_nonpoolL+num_unjudged_in_poolN+num_relS+num_rel_levelsT+	?U?results_rel_listY?+RES_RELS] ?!"docno"Xsim#hrank$+rel%+DOCNO_INFO&?
te_form_res_rels9??@@??Fepi9??rel_info9??results:??res_rels:??i<+?lnum_results=+?hmax_rel>+?dtext_results_info@F?Xtrec_qrelsAL?Tqrels_ptrC?`end_qrelsC?P
?F@?rrl?+?\g	comp_rank_judged??H@u??ptr1??ptr2??"comp_sim_docno?ZI@R??ptr1??ptr2??comp_docno ??I@?(ptr1!??ptr2"??te_form_res_rels_cleanup)??I@??current_query)X Amax_current_query*+?`B?-??`Bmax_rel_levels.+?`Bsaved_res_rels/??`Branked_rel_list0??`Bmax_ranked_rel_list1+?`Bdocno_info4??`Bmax_docno_info5+?`B
YGNU C11 5.3.0 -mtune=generic -march=i686 -gform_res_rels_jg.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-masterdJ@?
?signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int$doublefloat?measure_name[parameters[MEAS_ARGt8query_flag.summary_flag.debug_level.debug_query[relation_flag.average_complete_flag .judged_docs_only_flag$.num_docs_in_coll'.relevance_level(. max_num_docs_per_topic+.$rel_info_format,[(results_format-[,zscore_flag..0meas_arg34?EPI4?`^qida[run_idb[ret_formatc[q_resultse,RESULTSfn?qido[rel_formatp[q_rel_infoq,REL_INFOrm?
???^
?docno[simkTEXT_RESULTS?dnum_text_results.max_num_text_results.text_resultsd?TEXT_RESULTS_INFO?docno[rel.TEXT_QRELS??&?num_text_qrels'.text_qrels(?TEXT_QRELS_JG*?,?num_text_qrels_jg-.text_qrels_jg.??TEXT_QRELS_JG_INFO/ E	num_rel_retG.num_retJ.num_nonpoolL.num_unjudged_in_poolN.num_relS.num_rel_levelsT.	?U	results_rel_listY	.RES_RELS]_`Pqida[num_jgsb.jgscPRES_RELS_JGd"?docno#[sim$krank%.rel&.DOCNO_INFO'i
te_form_res_rels_jg9?dJ@P	??epi9??rel_info9??results:??res_rels:??i<.?lnum_results=.?hjg>.?dtext_results_info@??Ttrec_qrelsA??Pqrels_ptrC??`end_qrelsC??Lmax_relD.?\rel_level_ptrF	?H
?P@rrl?.?XVjEcomp_rank_judged??S@u?Kptr1K?ptr2K??comp_sim_docno%?)T@R??ptr1&K?ptr2'K?comp_docno1?{T@??ptr12K?ptr23K?te_form_res_rels_jg_cleanup9??T@??current_query*[ Amax_current_query+.aB?.	aBmax_rel_levels/.aBjgs0PaBnum_jgs1.aBmax_num_jgs2.aBranked_rel_list3	aBmax_ranked_rel_list4.aBdocno_info5K aBmax_docno_info6.$aB?pGNU C11 5.3.0 -mtune=generic -march=i686 -gformats.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameJparametersJMEAS_ARGc8?query_flagsummary_flagdebug_leveldebug_queryJrelation_flagaverage_complete_flag judged_docs_only_flag$num_docs_in_coll'relevance_level( max_num_docs_per_topic+$?,J(results_format-J,zscore_flag.0meas_arg3?4?EPI4?`AqidaJrun_idbJret_formatcJq_resultseRESULTSf?h?num_q_resultsimax_num_q_resultsjresultsk?AALL_RESULTSlPn?qidoJrel_formatpJq_rel_infoqREL_INFOr?tAnum_q_relsumax_num_q_relsvrel_infowA?ALL_REL_INFOx????name?J??J??????	??
[
J
?G????REL_INFO_FILE_FORMAT?a?'name?J??J??F???	?@
[
J
@?'RESULTS_FILE_FORMAT??????Jresults_info_format?J??J???FORM_INTER_PROCS?g
??)te_rel_info_format?  Ate_num_rel_info_format??` A
L,)te_results_format?d Ate_num_results_format??t A
?~)te_form_inter_procsn? Ate_num_form_inter_procs?? A?'GNU C11 5.3.0 -mtune=generic -march=i686 -gget_prefs.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master\U@??signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameTparametersTMEAS_ARGm8?query_flag'summary_flag'debug_level'debug_queryTrelation_flag'average_complete_flag 'judged_docs_only_flag$'num_docs_in_coll''relevance_level(' max_num_docs_per_topic+'$rel_info_format,T(results_format-T,zscore_flag.'0meas_arg3?4?EPI4?nIqidoTrel_formatpTq_rel_infoq%REL_INFOrt?num_q_relsu'max_num_q_relsv'rel_infow?IALL_REL_INFOxYT?2	jg3Tjsg4Trel_level5ddocno6TTEXT_PREFS7?9inum_text_prefs:'max_num_text_prefs;'text_prefs?qid?Wjg@WdocnoAWrelBWLINESC?te_get_qrels_jgT??e@??s	epiT??	text_qrels_fileTW?	all_rel_infoT??
fdV???
sizeW??l
ptrXW??
current_qidYW?h
current_jgYW?d
iZ*?`
lines[s??
line_ptr\s?\
num_lines]*?X
num_qid^*?T
num_jg^*?P
rel_info_ptr`??L
text_jg_info_ptray?H
text_jg_ptrb??D
text_qrels_ptrc?@??comp_lines_qid_jg_docno??[k@j??	ptr1?s?	ptr2?s?
cmp???lparse_qrels_line???k@~?k	start_ptr???	qid_ptr???	jg_ptr???	docno_ptr???	rel_ptr???
ptr?W?t
te_get_qrels_jg_cleanup?Cn@??
trec_qrels_bufMWHaB
text_jg_info_poolNyLaB
text_jg_poolO?PaB
text_qrels_poolPTaB
rel_info_poolQ?XaB__ctype_ptr__-W?
GNU C11 5.3.0 -mtune=generic -march=i686 -gget_qrels_prefs.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?n@??signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int#doublefloat?measure_nameZparametersZMEAS_ARGs8query_flag-summary_flag-debug_level-debug_queryZrelation_flag-average_complete_flag -judged_docs_only_flag$-num_docs_in_coll'-relevance_level(- max_num_docs_per_topic+-$rel_info_format,Z(results_format-Z,zscore_flag.-0meas_arg34?EPI4?nOqidoZrel_formatpZq_rel_infoq+REL_INFOrt?num_q_relsu-max_num_q_relsv-rel_infow?OALL_REL_INFOx_	Z?2jg3Zjsg4Zrel_level5jdocno6ZTEXT_PREFS7?9onum_text_prefs:-max_num_text_prefs;-text_prefsquery_flag<summary_flag<debug_level<debug_queryyrelation_flag<average_complete_flag <judged_docs_only_flag$<num_docs_in_coll'<relevance_level(< max_num_docs_per_topic+<$rel_info_format,y(results_format-y,zscore_flag.<0meas_arg3>4?EPI4?`?qidayrun_idbyret_formatcyq_resultse:RESULTSfOh?num_q_resultsi<max_num_q_resultsj<resultsk??ALL_RESULTSl?Dy?
>docnoysim?TEXT_RESULTS?num_text_results<max_num_text_results<text_results?>TEXT_RESULTS_INFOR!?qid"ydocno#ysim$yLINES%?te_get_trec_results6??v@A?n	epi6	?	text_results_file6y?	all_results7?
fd9??H
orig_buf:y?D
size;?l
ptr<?d
lines?n?@
line_ptr@n?`
num_linesA?\
num_qidB<?XCy??
q_results_ptrE??T
text_info_ptrFt?P
text_results_ptrG??L??comp_lines_qid_docno???{@D??	ptr1?n?	ptr2?n?
cmp???l
parse_results_line??!|@??b	start_ptr??	qid_ptr??	docno_ptr??	sim_ptr????
ptr?y?tte_get_trec_results_cleanup??@??
trec_results_buf0ylaB
text_info_pool1tpaB
text_results_pool2?taB
q_results_pool3?xaB__ctype_ptr__-yGNU C11 5.3.0 -mtune=generic -march=i686 -gget_zscores.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?@U?signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int`doublefloat?measure_nameTparametersTMEAS_ARGx8query_flag'summary_flag'debug_level'debug_queryTrelation_flag'average_complete_flag 'judged_docs_only_flag$'num_docs_in_coll''relevance_level(' max_num_docs_per_topic+'$rel_info_format,T(results_format-T,zscore_flag.'0meas_arg34?EPI4?T?\meas?Tmean?estddev?eZSCORE_QID?*ϧqid?Tnum_zscores?'zscoresҧ\ZSCORES?n??num_q_zscores?'q_zscores???ALL_ZSCORES׼0Dqid1Tmeas2Tmean3Tstddev4TLINES5te_get_zscoresC??@&?w	epiC?	zscores_fileCZ?	all_zscoresDw?
fdF??L
sizeG??l
ptrHT?D
current_qidIT?h
iJ'?d
linesK}?H
line_ptrL}?`
num_linesM'?\
num_qidN'?X
text_zscores_ptrP??T
zscores_ptrQ??P?Dcomp_lines_qid_meas??΃@D??	ptr1?}?	ptr2?}?
cmp???lparse_zscore_line???@~?p	start_ptr?$?	qid_ptr?$?	meas_ptr?$?	mean_ptr?$?	stddev_ptr?$?
ptr?T?t
te_get_zscores_cleanup????@m?
trec_zscores_buf>T|aB
text_zscores_pool???aB
zscores_pool@??aB__ctype_ptr__-T?	?GNU C11 5.3.0 -mtune=generic -march=i686 -gm_11pt_avg.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?@=?signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameUparametersUMEAS_ARGn8?query_flag(summary_flag(debug_level(debug_queryUrelation_flag(average_complete_flag (judged_docs_only_flag$(num_docs_in_coll'(relevance_level(( max_num_docs_per_topic+($rel_info_format,U(results_format-U,zscore_flag.(0meas_arg3?4?EPI4?83name9Uvalue<[TREC_EVAL_VALUE=@?qidAUnum_queriesB(valuesC?num_valuesD(max_num_valuesE(3TREC_EVALGJL
printable_paramsMUnum_paramsQ(param_valuesR&PARAMSS?`dqidaUrun_idbUret_formatcUq_resultse&RESULTSfn?qidoUrel_formatpUq_rel_infoq&REL_INFOrst?num_q_relsu(max_num_q_relsv(w??ALL_REL_INFOx?	trec_meas(|?name~Uexplanation?Uinit_meas?'calc_meas?|acc_meas??calc_avg_meas??print_single_meas??print_final_and_cleanup_meas?meas_params? eval_index?($
?!??
?PP[fq!Va?ldw-
??Pq?!???
??Pq?!??
??Pq??
?P!?
TREC_MEAS? E?num_rel_retG(num_retJ(num_nonpoolL(num_unjudged_in_poolN(num_relS(num_rel_levelsT(rel_levelsU?results_rel_listY?(RES_RELS]5
te_calc_11ptavg/??@=?	epi/P?/[?results0f?tm0	?eval0!?cutoff_percents2	?Lcutoffs3??Hcurrent_cut4(?lrr5???rel_so_far6(?hi7(?dprecis8[?@int_precis8[?Xsum9[?P	$[[2	4
float_cutoff_array"	? Adefault_11ptavg_cutoffs
8!Ate_meas_11pt_avg$`!A??
GNU C11 5.3.0 -mtune=generic -march=i686 -gm_binG.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master@?@??signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameQparametersQMEAS_ARGj8?query_flag$summary_flag$debug_level$debug_queryQrelation_flag$average_complete_flag $judged_docs_only_flag$$num_docs_in_coll'$relevance_level($ max_num_docs_per_topic+$$rel_info_format,Q(results_format-Q,zscore_flag.$0meas_arg3?4?EPI4?8/name9Qvalue!_GAINS?o
te_calc_GG???@??
epiG??<G??resultsH??tmH
?evalHV?EJ!??results_gainKT??sum_resultsKT?hideal_gainLT?`sum_idealLT?Xsum_costMT?Pmin_costMT??results_gNT?Hcur_levelO!?DNO!?@iP!??gainsQ???~
Y
setup_gains??c?@??
tm?
?E??
?gains??
?pairs??
?lnum_pairs?!?hi?!?dj?!?`2?!?\?
!?8comp_rel_gain??u?@/?ptr1ݲ?ptr2ݲ?get_gain?T??@H?T[?T?gains?Y?i?!?t!_?default_G_gains?aBte_meas_GY "A$	?GNU C11 5.3.0 -mtune=generic -march=i686 -gm_gm_bpref.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master?@?? signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameUparametersUMEAS_ARGn8?query_flag(summary_flag(debug_level(debug_queryUrelation_flag(average_complete_flag (judged_docs_only_flag$(num_docs_in_coll'(relevance_level(( max_num_docs_per_topic+($rel_info_format,U(results_format-U,zscore_flag.(0meas_arg3?4?EPI4?83name9Uvalue<[TREC_EVAL_VALUE=@?qidAUnum_queriesB(valuesC?num_valuesD(max_num_valuesE(3TREC_EVALGJL
printable_paramsMUnum_paramsQ(param_valuesR&PARAMSS?`dqidaUrun_idbUret_formatcUq_resultse&RESULTSfn?qidoUrel_formatpUq_rel_infoq&REL_INFOrst?num_q_relsu(max_num_q_relsv(ew??ALL_REL_INFOx?	trec_meas(|?name~Uexplanation?Uinit_meas?'calc_meas?|acc_meas??calc_avg_meas??print_single_meas??print_final_and_cleanup_meas?meas_params? eval_index?($
?!??
?PP[fq!Va?ldw-
??Pq?!???
??Pq?!??
??Pq??
?P!?
TREC_MEAS? E?num_rel_retG(num_retJ(num_nonpoolL(num_unjudged_in_poolN(num_relS(num_rel_levelsT(rel_levelsU?results_rel_listY?(RES_RELS]5
te_calc_gm_bpref%??@???epi%P?e%[?results&f?tm&??eval&!?res_rels(???j)(?lnonrel_so_far*(?hrel_so_far*(?dpool_unjudged_so_far*(?`num_nonrel+(?\bpref,[?P	$te_meas_gm_bpref$`"A??GNU C11 5.3.0 -mtune=generic -march=i686 -gm_gm_map.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master̓@?y!signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameSparametersSMEAS_ARGl8?query_flag&summary_flag&debug_level&debug_querySrelation_flag&average_complete_flag &judged_docs_only_flag$&num_docs_in_coll'&relevance_level(& max_num_docs_per_topic+&$rel_info_format,S(results_format-S,zscore_flag.&0meas_arg3?4?EPI4?81name9Svalue?	
epiK??K??resultsL??tmL	
?evalL]?N ??results_gainO[??results_dcgO[?hideal_gainP[?`ideal_dcgP[?XsumQ[?P"R(?Lnum_relS(?Hcur_levelT(?D.T(?@iU(??gainsV???
`
setup_gains??J?@??
tm?	
???
?gains??
?pairs??
?lnum_pairs?(?hi?(?dj?(?`?(?\?
 ??comp_rel_gain??\?@/?ptr1??ptr2??get_gain?[??@H?M;?M?gains?R?i?(?t(X?default_ndcg_gains
?aBte_meas_ndcg_rel`?%A?{GNU C11 5.3.0 -mtune=generic -march=i686 -gm_num_nonrel_judged_ret.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master԰@w?*signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int+doublefloat?measure_namebparametersbMEAS_ARG{8query_flag5summary_flag5debug_level5debug_querybrelation_flag5average_complete_flag 5judged_docs_only_flag$5num_docs_in_coll'5relevance_level(5 max_num_docs_per_topic+5$rel_info_format,b(results_format-b,zscore_flag.50meas_arg34?EPI4?8@name9bvalueECvD????Xarray????CXPREFS_ARRAY?????array???C?COUNTS_ARRAY??<?a	ecs?a	num_ecs?+prefs_array??rel_array?g	num_prefs_fulfilled_ret?+num_prefs_possible_ret?+num_prefs_fulfilled_imp?+ num_prefs_possible_imp?+$num_prefs_possible_notoccur?+(num_nonrel?+,num_nonrel_ret?+0num_rel?+4num_rel_ret?+8?nJG?&??	num_jgs?+jgs??	??+num_judged_ret?+pref_counts?m	RESULTS_PREFS?w	
te_calc_prefs_avgjg)?h?@$??
epi)Y??)d?results*o?tm*?
?eval+*?results_prefs-?	??i.+?lful/+?\poss/+?Xsum0d?`?
-te_meas_prefs_avgjg-`(A?
?"GNU C11 5.3.0 -mtune=generic -march=i686 -gm_prefs_avgjg_imp.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master??@
?.signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int?%doublefloat?measure_namebparametersbMEAS_ARG{8query_flag/summary_flag/debug_level/debug_querybrelation_flag/average_complete_flag /judged_docs_only_flag$/num_docs_in_coll'/relevance_level(/ max_num_docs_per_topic+/$rel_info_format,b(results_format-b,zscore_flag./0meas_arg34?EPI4?8@name9bvalue
??a??2???
??a??2??
?
a???
?)a,2TREC_MEAS?(3o?rel_levelpvnum_in_ecq3docid_rankssFECvL???`array??>?K`PREFS_ARRAY????array?>?K?COUNTS_ARRAY??<?i	ecs?i	num_ecs?3prefs_array??rel_array?o	num_prefs_fulfilled_ret?3num_prefs_possible_ret?3num_prefs_fulfilled_imp?3 num_prefs_possible_imp?3$num_prefs_possible_notoccur?3(num_nonrel?3,num_nonrel_ret?30num_rel?34num_rel_ret?38?vJG?.??	num_jgs?3jgs??	>?3/?3pref_counts?u	RESULTS_PREFS?	
te_calc_prefs_avgjg_Rnonrel8?h?@??
epi8a?8l?results9w?tm9?
?eval:2?rpl?`R?3?\N?3?Xnum_ful@3??&@3???
5recalculatej??@2?2jgj2?/j=?ret_num_fuljF?ret_num_posskF?num_fulm3?p&n3?l??@??ec1s3?hec2s3?dptr1tF?`ptr2tF?\new_nonrel_ecv?????@?i?3?Xj?3?Tfirst_discarded_nonrel?3?Da???Lnum_nonrel_seen?3?P>?3?H8u	3te_meas_prefs_avgjg_Rnonrel5 )Af?%GNU C11 5.3.0 -mtune=generic -march=i686 -gm_prefs_avgjg_Rnonrel_ret.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master??@??1signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int?-doublefloat?measure_namejparametersjMEAS_ARG?8query_flag7summary_flag7debug_level7debug_queryjrelation_flag7average_complete_flag 7judged_docs_only_flag$7num_docs_in_coll'7relevance_level(7 max_num_docs_per_topic+7$rel_info_format,j(results_format-j,zscore_flag.70meas_arg34?EPI4?8Hname9jvalue6?rp@?	??iA7?lsumBp?`RC7?\NC7?Xnum_fulD7??]D7???
9recalculatekx?@??,jgk,?fk7?ret_num_fulkJ?ret_num_posslJ?num_fuln7?p]o7?l??@?ec1t7?hec2t7?dptr1uJ?`ptr2uJ?\new_nonrel_ecw?????@}i?7?Xj?7?Tfirst_discarded_nonrel?7?Ha???Lnum_nonrel_seen?7?P2y	7te_meas_prefs_avgjg_Rnonrel_ret9`)A?
'GNU C11 5.3.0 -mtune=generic -march=i686 -gm_prefs_num_prefs_ful.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-masterH?@?f3signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int?)doublefloat?measure_namefparametersfMEAS_ARG8query_flag3summary_flag3debug_level3debug_queryfrelation_flag3average_complete_flag 3judged_docs_only_flag$3num_docs_in_coll'3relevance_level(3 max_num_docs_per_topic+3$rel_info_format,f(results_format-f,zscore_flag.30meas_arg34?EPI4?8Dname9fvalue
??a??2???
??a??2??
?
a???
?)a,2TREC_MEAS?(3o?rel_levelpvnum_in_ecq3docid_rankssFECvL????`array????K`PREFS_ARRAY?????array???K?COUNTS_ARRAY??<?i	ecs?i	num_ecs?3prefs_array??rel_array?o	num_prefs_fulfilled_ret?3num_prefs_possible_ret?3num_prefs_fulfilled_imp?3 num_prefs_possible_imp?3$num_prefs_possible_notoccur?3(num_nonrel?3,num_nonrel_ret?30num_rel?34num_rel_ret?38?vJG?.??	num_jgs?3jgs??	??3num_judged_ret?3pref_counts?u	RESULTS_PREFS?	
te_calc_prefs_num_prefs_ful#?H?@???
epi#a??#l?results$w?tm$?
?eval%2?results_prefs'?	?Li(3?lful)3?h?
5te_meas_prefs_num_prefs_ful5?)A?
?'GNU C11 5.3.0 -mtune=generic -march=i686 -gm_prefs_num_prefs_ful_ret.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master??@??3signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int?-doublefloat?measure_namejparametersjMEAS_ARG?8query_flag7summary_flag7debug_level7debug_queryjrelation_flag7average_complete_flag 7judged_docs_only_flag$7num_docs_in_coll'7relevance_level(7 max_num_docs_per_topic+7$rel_info_format,j(results_format-j,zscore_flag.70meas_arg34?EPI4?8Hname9jvalue%??%gain@XREL_GAINA5C?rel_gainsD??E%total_num_at_levelsF%cGAINSGs
te_calc_RndcgO???@:?&
epiO???O??resultsP??tmP&
?evalPZ??R%??results_gainSX??results_dcgSX?hold_ideal_gainTX?`ideal_gainTX?Xideal_dcgTX?PsumUX?Hnum_changed_ideal_gainV%?Dcur_levelW%?@?W%??iX%??gainsY???,
]
setup_gains??
?@??
tm?&
????
?gains??
?pairs??
?lnum_pairs?%?hi?%?dj?%?`??%?\?
%?<comp_rel_gain???@/?!ptr1??ptr2??get_gain?XK?@H?j??j?gains?o?i?%?t%u?default_ndcg_gains
?aBte_meas_Rndcg]?-A?4GNU C11 5.3.0 -mtune=generic -march=i686 -gm_Rprec.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master??@??=signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameRparametersRMEAS_ARGk8?query_flag%summary_flag%debug_level%debug_queryRrelation_flag%average_complete_flag %judged_docs_only_flag$%num_docs_in_coll'%relevance_level(% max_num_docs_per_topic+%$rel_info_format,R(results_format-R,zscore_flag.%0meas_arg3?4?EPI4?80name9Rvaluesigned charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int doublefloat?measure_nameWparametersWMEAS_ARGp8query_flag*summary_flag*debug_level*debug_queryWrelation_flag*average_complete_flag *judged_docs_only_flag$*num_docs_in_coll'*relevance_level(* max_num_docs_per_topic+*$rel_info_format,W(results_format-W,zscore_flag.*0meas_arg34?EPI4?85name9Wvalue<]TREC_EVAL_VALUE=@?qidAWnum_queriesB*valuesC?num_valuesD*max_num_valuesE*5TREC_EVALGLLprintable_paramsMWnum_paramsQ*param_valuesR(PARAMSS?`fqidaWrun_idbWret_formatcWq_resultse(RESULTSfn?qidoWrel_formatpWq_rel_infoq(REL_INFOrut?num_q_relsu*max_num_q_relsv*?w??ALL_REL_INFOx?	trec_meas(|?name~Wexplanation?Winit_meas?)calc_meas?~acc_meas??calc_avg_meas??print_single_meas??print_final_and_cleanup_meas?meas_params?  eval_index?*$
?#??
?RR]hs#Xc?nfy/
??Rs?#???
??Rs?#??
??Rs??
?R#TREC_MEAS? E?num_rel_retG*num_retJ*num_nonpoolL*num_unjudged_in_poolN*num_relS*num_rel_levelsT*rel_levelsU?results_rel_listY?*RES_RELS]7
te_calc_Rprec_mult+?D?@?	epi+R??+]?results,h?tm,	?eval-#?cutoff_percents/	?Tcutoffs0??Pcurrent_cut1*?lrr2???rel_so_far3*?hi4*?dprecis5]?Hint_precis5]?X	&]])	6	Rprec_cutoff_array	.Adefault_Rprec_cutoffsP.Ate_meas_Rprec_mult&`.A?	?5GNU C11 5.3.0 -mtune=generic -march=i686 -gm_Rprec_mult_avgjg.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-masterX?@?>signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int&doublefloat?measure_name]parameters]MEAS_ARGv8query_flag0summary_flag0debug_level0debug_query]relation_flag0average_complete_flag 0judged_docs_only_flag$0num_docs_in_coll'0relevance_level(0 max_num_docs_per_topic+0$rel_info_format,](results_format-],zscore_flag.00meas_arg34?EPI4?8;name9]value??epi*M?tm*??eval*?!runid R?aBte_meas_runid!@/A*	?7GNU C11 5.3.0 -mtune=generic -march=i686 -gm_set_F.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master??@?@signed charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameRparametersRMEAS_ARGk8?query_flag%summary_flag%debug_level%debug_queryRrelation_flag%average_complete_flag %judged_docs_only_flag$%num_docs_in_coll'%relevance_level(% max_num_docs_per_topic+%$rel_info_format,R(results_format-R,zscore_flag.%0meas_arg3?4?EPI4?80name9RvalueGNU C11 5.3.0 -mtune=generic -march=i686 -gm_yaap.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-masterx?@?nCsigned charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned intdoublefloat?measure_nameQparametersQMEAS_ARGj8?query_flag$summary_flag$debug_level$debug_queryQrelation_flag$average_complete_flag $judged_docs_only_flag$$num_docs_in_coll'$relevance_level($ max_num_docs_per_topic+$$rel_info_format,Q(results_format-Q,zscore_flag.$0meas_arg3?4?EPI4?8/name9QvalueT?h?>T?d???	?\i@'?`get_float_paramsp?>?@>??pM??pT??r'?lptrsT?h?sT?d?t?	?`get_param_pairs??|?@?????M???T???'?llast_seen??kptr?T?d??T?`????\.comp_long??K?@??ptr1???ptr2???comp_float??^?@:?5
ptr1??	?ptr2??	?append_long?T??@_??
??T???'???'?l?T?happend_float?T??@k??
??T???Z?X??'?l?T?happend_string?Tb?@~???T???T???'?l?T?h	?BGNU C11 5.3.0 -mtune=generic -march=i686 -gmeas_print_final.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master??@?\Gsigned charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int$doublefloat?measure_name[parameters[MEAS_ARGt8query_flag.summary_flag.debug_level.debug_query[relation_flag.average_complete_flag .judged_docs_only_flag$.num_docs_in_coll'.relevance_level(. max_num_docs_per_topic+.$rel_info_format,[(results_format-[,zscore_flag..0meas_arg34?EPI4?89name9[valuete_meas_num_ret?te_meas_num_rel@te_meas_num_rel_retAte_meas_mapBte_meas_gm_mapCte_meas_RprecDte_meas_bprefEte_meas_recip_rankFte_meas_iprec_at_recallGte_meas_PHte_meas_relstringIte_meas_recallJte_meas_infAPKte_meas_gm_bprefLte_meas_Rprec_multMte_meas_utilityNte_meas_11pt_avgOte_meas_binGPte_meas_GQte_meas_ndcgRte_meas_ndcg_relSte_meas_RndcgTte_meas_ndcg_cutUte_meas_map_cutVte_meas_relative_PWte_meas_successXte_meas_set_PYte_meas_set_relative_PZte_meas_set_recall[te_meas_set_map\te_meas_set_F]te_meas_num_nonrel_judged_ret^te_meas_prefs_num_prefs_poss_te_meas_prefs_num_prefs_ful`te_meas_prefs_num_prefs_ful_retate_meas_prefs_simpbte_meas_prefs_paircte_meas_prefs_avgjgdte_meas_prefs_avgjg_Rnonrelete_meas_prefs_simp_retfte_meas_prefs_pair_retgte_meas_prefs_avgjg_rethte_meas_prefs_avgjg_Rnonrel_retite_meas_prefs_simp_impjte_meas_prefs_pair_impkte_meas_prefs_avgjg_implte_meas_map_avgjgmte_meas_P_avgjgnte_meas_Rprec_mult_avgjgote_meas_yaapp?
?

*3te_trec_measuresr?
2Ate_num_trec_measures???2A^
*te_trec_measure_nicknames??4Ate_num_trec_measure_nicknames???4A?uEGNU C11 5.3.0 -mtune=generic -march=i686 -gtrec_eval.c/cygdrive/h/Downloads/trec_eval-master/trec_eval-master$APIsigned charunsigned charshort intshort unsigned intintunsigned intlong long intlong long unsigned intlong doublecharlong intsizetypelong unsigned int`doublefloat?optionD?EZhas_argF?flagGxvalH?	?measure_nameTparametersT
MEAS_ARG?	8Squery_flag'summary_flag'debug_level'debug_queryTrelation_flag'average_complete_flag 'judged_docs_only_flag$'num_docs_in_coll''relevance_level(' max_num_docs_per_topic+'$rel_info_format,T(results_format-T,zscore_flag.'0meas_arg3S4?
EPI4	8?9Tvalue?	__tm_isdst?? IH?
	_fnargsI?
	_dso_handleJ?
?_fntypesL?_is_cxaO?
F?

_atexit?[?
	_next\?
	_ind]?	_fns_?
I`+
??

?
?

?
 __sbufs	_baset?	_sizeu?!?==F?)C"_reent@8?#?VV
$?X

%_unused_randY?%_strtok_lastZ?%_asctime_buf[?%_localtime_buf\}	$%_gamma_signgam]?H%_rand_next^?P%_r48_jX%_mblen_state`h%_mbtowc_stateap%_wctomb_statebx%_l64a_bufc??%_signal_bufd??%_getdate_erre??%_mbrlen_statef?%_mbrtowc_stateg?%_mbsrtowcs_stateh?%_wcrtomb_statei?%_wcsrtombs_statej?%_h_errnok??$?q6
%_nextfs?%_nmalloct?x&_reentl]&_unusedu

%_errno:?%_stdin?d%_stdout?d%_stderr?d%_incA?%_emergencyB?%_current_categoryD?0%_current_localeE"4%__sdidinitG?8%__cleanupI<%_resultLg	@%_result_kM?D%_p5sNg	H%_freelistOL%_cvtlenR?P%_cvtbufS?T%_newvTX'_atexitz?
H'_atexit0{?
L'_sig_func*?'__sglue? ?'__sf?6?!?	=F")?!d-=Fd?!?G=F3
}]

}m
__sFILE64p??	_p?	_r??	_w??	_flags?	_file?	_bf??
	_lbfsize??	_data?=	_cookie?F 	_read??$	_write?	(%_seek-,%_closeG0%_ub?
4%_up?<%_ur?@%_ubuf	MD%_nbuf
]G%_lb
?
H%_blksize?P%_flags2?T%_offsetTX%_seek64`%_lock+d%_mbstateh!s=Fs??__FILEm(_glue!^%_next#^%_niobs$?%_iobs%d (_rand48=?%_seed>?%_mult??%_add@?
??

??

??

??

??

?
)=g	)*?0
mF

kV
HINSTANCE__
?z	unused
??HINSTANCE
??VHMODULE
?z_SECURITY_ATTRIBUTES

	nLength
k	lpSecurityDescriptor
?	bInheritHandle
\SECURITY_ATTRIBUTES
?Fdouble?
kC
(_KSYSTEM_TIME?%LowPartO%High1Time?%High2Time?KSYSTEM_TIMEC"_KUSER_SHARED_DATA?%Reserved1
?%InterruptTime?%Reserved2'DismountCountO?
}+*
?KUSER_SHARED_DATA?+e20cygwin_getinfo_types?,CW_LOCK_PINFO,CW_UNLOCK_PINFO,CW_GETTHREADNAME,CW_GETPINFO,CW_SETPINFO,CW_SETTHREADNAME,CW_GETVERSIONINFO,CW_READ_V1_MOUNT_TABLES,CW_USER_DATA,CW_PERFILE	,CW_GET_CYGDRIVE_PREFIXES
,CW_GETPINFO_FULL,CW_INIT_EXCEPTIONS,CW_GET_CYGDRIVE_INFO
,CW_SET_CYGWIN_REGISTRY_NAME,CW_GET_CYGWIN_REGISTRY_NAME,CW_STRACE_TOGGLE,CW_STRACE_ACTIVE,CW_CYGWIN_PID_TO_WINPID,CW_EXTRACT_DOMAIN_AND_USER,CW_CMDLINE,CW_CHECK_NTSEC,CW_GET_ERRNO_FROM_WINERROR,CW_GET_POSIX_SECURITY_ATTRIBUTE,CW_GET_SHMLBA,CW_GET_UID_FROM_SID,CW_GET_GID_FROM_SID,CW_GET_BINMODE,CW_HOOK,CW_ARGV,CW_ENVP,CW_DEBUG_SELF,CW_SYNC_WINENV ,CW_CYGTLS_PADSIZE!,CW_SET_DOS_FILE_WARNING",CW_SET_PRIV_KEY#,CW_SETERRNO$,CW_EXIT_PROCESS%,CW_SET_EXTERNAL_TOKEN&,CW_GET_INSTKEY',CW_INT_SETLOCALE(,CW_CVT_MNT_OPTS),CW_LST_MNT_OPTS*,CW_STRERROR+,CW_CVT_ENV_TO_WINENV,,CW_ALLOC_DRIVE_MAP-,CW_MAP_DRIVE_MAP.,CW_FREE_DRIVE_MAP/,CW_SETENT0,CW_GETENT1,CW_ENDENT2,CW_GETNSSSEP3,CW_GETPWSID4,CW_GETGRSID5,CW_CYGNAME_FROM_WINNAME6,CW_FIXED_ATEXIT7,CW_GETNSS_PWD_SRC8,CW_GETNSS_GRP_SRC9,CW_EXCEPTION_RECORD_FROM_SIGINFO_T:cygwin_getinfo_types?E(per_process?-l%initial_sp/?%magic_biscuit3J%dll_major4J%dll_minor5J%impure_ptr_ptr7l%envptr9r%malloc>?%free??%realloc@? %fmode_ptrBI$%mainD?(%ctorsE?,%dtorsF?0%data_startIF4%data_endJF8%bss_startKF<%bss_endLF@%callocN?D%premainP?H%run_ctors_pS;X%unusedU3\%cxx_mallocX?x%hmoduleZ?|%api_major\k?%api_minor]k?%unused2cF?%posix_memalignf?%pseudo_reloc_starthF?%pseudo_reloc_endiF?%image_basejF?%threadinterfacemb?%impure_ptrq=?=-!F?)x)?F?!F?F)?!???--??
!F?))?


)"?-"?per_process_cxx_malloc ?	oper_new?	oper_new__?	oper_delete?	oper_delete__?	oper_new_nt		oper_new___nt		oper_delete_nt	oper_delete___nt(!?))?-MTinterface?b.concurrency??.threadcount??.pthread_prepare??.pthread_child??.pthread_parent??/Init?_ZN11MTinterface4InitEv??b/fixup_before_fork?_ZN11MTinterface17fixup_before_forkEvb0fixup_after_fork?_ZN11MTinterface16fixup_after_forkEv[b-callback??.cb??
.next??hMainFunc?1std?2nothrow_tcsize_t???!F???!F???)F3_cygwin_crt0_common]_cygwin_crt0_common@8??A,??4f]??5u]"67newu_"57uwasnull`S8?A"?9t88NA."?9t0:?Ac"9t0;B	;?<_impure_ptr:=
?	 =_data_start__?>_data_end__?>_bss_start__?>_bss_end__?>__CTOR_LIST__?
>__DTOR_LIST__?
?CYGTLS_PADSIZE
? ?1?@SharedUserData?? ?? ? A+>sec_none_nihBcw_std_mask
!?EP7.{0P?A?tut?AWA?sA??A?ttut0tu?A?_cygwin_noncygwin_dll_entry_cygwin_noncygwin_dll_entry@12?A	?h??reasonh?ptrf?AstoredHandle??aBstoredReasonh?aBstoredPtrf?aBdll_index??aB __dynamically_loaded??aB!DllMainDllMain@12???hf"cygwin_attach_dll????#cygwin_detach_dll??=OGNU C++ 4.9.3 20150626 (Fedora Cygwin 4.9.3-1) -mtune=generic -march=i686 -g -ggdb -O2 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/build=/usr/src/debug/cygwin-2.4.1-1 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/src/newlib-cygwin=/usr/src/debug/cygwin-2.4.1-1 -fno-rtti -fno-exceptions -fno-use-cxa-atexit -fno-common -fbuiltin -fmessage-length=0 -fmerge-constants -ftracer/usr/src/debug/cygwin-2.4.1-1/winsup/cygwin/lib/dll_main.cc A}Tintunsigned intcharlong intlong long intshort unsigned intwchar_tsizetypelong unsigned intunsigned charBOOLrDWORD|ufloatLPVOID??signed charshort intlong long unsigned intHINSTANCE__?'unused?HINSTANCE?8doubleDllMainDllMain@12? A?	hInst'?	reason??	reserved??	?OGNU C 4.9.3 20150626 (Fedora Cygwin 4.9.3-1) -mtune=generic -march=i686 -g -ggdb -O2 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/build=/usr/src/debug/cygwin-2.4.1-1 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/src/newlib-cygwin=/usr/src/debug/cygwin-2.4.1-1 -fno-common -fbuiltin -fmessage-length=0 -fmerge-constants -ftracer/usr/src/debug/cygwin-2.4.1-1/winsup/cygwin/lib/pseudo-reloc-dummy.c0A*U_pei386_runtime_relocator0A?T?OGNU C 4.9.3 20150626 (Fedora Cygwin 4.9.3-1) -mtune=generic -march=i686 -g -ggdb -O2 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/build=/usr/src/debug/cygwin-2.4.1-1 -fdebug-prefix-map=/home/corinna/src/cygwin/cygwin-2.4.1/cygwin-2.4.1-1.i686/src/newlib-cygwin=/usr/src/debug/cygwin-2.4.1-1 -fno-common -fbuiltin -fmessage-length=0 -fmerge-constants -ftracer/usr/src/debug/cygwin-2.4.1-1/winsup/cygwin/lib/cygwin_attach_dll.c`A4?Uintsize_t??unsigned intshort unsigned intcharlong intlong long int?sizetypelong unsigned intunsigned charDWORD|\floatsigned charshort intqlong long unsigned intULONG_PTR7\DWORD_PTR??__int32_t??__uint32_tA?_LOCK_T?_off64_t3_fpos_t''_fpos64_t-*_ssize_t7?wint_ta?J?__wchLj__wchbM?	q?
PG?__countI?__valueNy_mbstate_tO?_flock_tS__ULong\
_Bigint-g_next/g_k0?_maxwds0?_sign0?_wds0?_x1m	?}
P
__tm$5+__tm_sec7?__tm_min8?__tm_hour9?__tm_mday:?__tm_mon;?__tm_year?__tm_isdst?? WH?_fnargsI?_dso_handleJ??_fntypesL?_is_cxaO?	??
P_atexit?[?_next\?_ind]?_fns_?W`+??	??
P?
__sbufs_baset?_sizeu?Z==?D?C_reent@8?_errno:?_stdin?m
_stdout?m
_stderr?m
_incA?_emergencyB
_current_categoryD?0_current_localeE4__sdidinitG?8__cleanupI.
<_resultLg@_result_kM?D_p5sNgH_freelistO4
L_cvtlenR?P_cvtbufSDT_newv?X_atexitz?H_atexit0{?L_sig_funcE
?__sglue?)
?__sf?Q
?Z=??
?:6=?:??P=?<	qf
P	qv
P
__sFILE64p??	_p?_r??_w??_flags?_file?_bf??_lbfsize??_data?=_cookie?? _read??$_write?(_seek6,_closeP0_ub?4_up?<_ur?@_ubuf	VD_nbuf
fG_lb
?H_blksize?P_flags2?T_offset*X_seek64
`_lock?d_mbstate?hI
=?I??	__FILEv_glue!g
_next#g
_niobs$?_iobs%m
)

_rand48=?
_seed>?
_mult??
_add@			?

P?Wn_unused_randY?_strtok_lastZD_asctime_buf[n_localtime_buf\}$_gamma_signgam]?H_rand_next^?P_r48_s
X_mblen_state`?h_mbtowc_statea?p_wctomb_stateb?x_l64a_bufc~?_signal_bufd??_getdate_erre??_mbrlen_statef??_mbrtowc_stateg??_mbsrtowcs_stateh??_wcrtomb_statei??_wcsrtombs_statej??_h_errnok??	~
P	?
P	?
P?p?_nextfs?_nmalloct?x	??
P	??
P?U
_reentl?
_unusedu?	#

P.
=#
gE
?K
:
	
a

P
HINSTANCE__??
unused??HINSTANCE??
a
HMODULEŅ
?doubleint32_t	?uint32_t	 	per_process?
-Rinitial_sp
/Dmagic_biscuit
3?
dll_major
4?
dll_minor
5?
impure_ptr_ptr
7Renvptr
9Xmalloc
>sfree
??realloc
@? fmode_ptr
BJ$main
D?(ctors
E?,dtors
F?0data_start
I?4data_end
J?8bss_start
K?<bss_end
L?@calloc
N?Dpremain
P?Hrun_ctors_p
S?
Xunused
U\cxx_malloc
X<xhmodule
Z?
|api_major
\??api_minor
]??unused2
cB?posix_memalign
fk?pseudo_reloc_start
h??pseudo_reloc_end
i??image_base
j??threadinterface
o??impure_ptr
q=?=^D?s?d??y????????^^???????	??
P??^?
	?$
Pper_process_cxx_malloc$	?R
P?k?
??RMainFunc?cygwin_attach_dll?`A4?? h?
? fq?!u?
bB"|A??#t?#tbB$?A9%_cygwin_crt0_common_cygwin_crt0_common@8?9q&dll_dllcrt0??
??QGNU C11 5.3.0 -mtune=generic -march=i686 -g -ggdb -g -ggdb -g -O2 -O2 -O2 -fdebug-prefix-map=/cygdrive/i/szsz/tmpp/gcc/gcc-5.3.0-3.i686/build=/usr/src/debug/gcc-5.3.0-3 -fdebug-prefix-map=/cygdrive/i/szsz/tmpp/gcc/gcc-5.3.0-3.i686/src/gcc-5.3.0=/usr/src/debug/gcc-5.3.0-3 -fdebug-prefix-map=/cygdrive/i/szsz/tmpp/gcc/gcc-5.3.0-3.i686/build=/usr/src/debug/gcc-5.3.0-3 -fdebug-prefix-map=/cygdrive/i/szsz/tmpp/gcc/gcc-5.3.0-3.i686/src/gcc-5.3.0=/usr/src/debug/gcc-5.3.0-3 -fbuilding-libgcc -fno-stack-protector/usr/src/debug/gcc-5.3.0-3/libgcc/libgcc2.c/usr/src/debug/gcc-5.3.0-3/i686-pc-cygwin/libgccXintunsigned intshort unsigned intlong long intlong doublecharsigned charunsigned charshort intlong long unsigned intlong intsizetypelong unsigned int:ix86_tune_indicesrb[
[Binary data: GCC i386 back-end enumerations embedded in the debug info: ix86_tune_indices (X86_TUNE_SCHEDULE, X86_TUNE_PARTIAL_REG_DEPENDENCY, X86_TUNE_SSE_PARTIAL_REG_DEPENDENCY, ... X86_TUNE_ADJUST_UNROLL, X86_TUNE_LAST) and ix86_arch_indices (X86_ARCH_CMOV, X86_ARCH_CMPXCHG, X86_ARCH_CMPXCHG8B, X86_ARCH_XADD, X86_ARCH_BSWAP, X86_ARCH_LAST), plus floating-point type names (float, complex float, double, complex double, complex long double, __float128) and the __CTOR_LIST__ / __DTOR_LIST__ constructor and destructor list symbols.]
[DWARF line-table headers naming the compilation directories and source files of the bundled program. The directories point at /usr/src/debug/cygwin-2.4.1-1/winsup/cygwin, /usr/include, the newlib and w32api header trees, and the gcc 4.9.3/5.3.0 include directories. The source files are the trec_eval sources plus the Cygwin runtime files:
- input/formatting modules: convert_zscores.c, form_prefs_counts.c, form_res_rels.c, form_res_rels_jg.c, formats.c, get_prefs.c, get_qrels.c, get_qrels_jg.c, get_qrels_prefs.c, get_trec_results.c, get_zscores.c;
- measure implementations: m_11pt_avg.c, m_binG.c, m_bpref.c, m_G.c, m_gm_bpref.c, m_gm_map.c, m_infap.c, m_iprec_at_recall.c, m_map.c, m_map_avgjg.c, m_map_cut.c, m_ndcg.c, m_ndcg_cut.c, m_ndcg_p.c, m_ndcg_rel.c, m_num_nonrel_judged_ret.c, m_num_q.c, m_num_rel.c, m_num_rel_ret.c, m_num_ret.c, m_P.c, m_P_avgjg.c, m_prefs_avgjg.c, m_prefs_avgjg_imp.c, m_prefs_avgjg_ret.c, m_prefs_avgjg_Rnonrel.c, m_prefs_avgjg_Rnonrel_ret.c, m_prefs_num_prefs_ful.c, m_prefs_num_prefs_ful_ret.c, m_prefs_num_prefs_poss.c, m_prefs_pair.c, m_prefs_pair_imp.c, m_prefs_pair_ret.c, m_prefs_simp.c, m_prefs_simp_imp.c, m_prefs_simp_ret.c, m_recall.c, m_recip_rank.c, m_rel_P.c, m_relstring.c, m_Rndcg.c, m_Rprec.c, m_Rprec_mult.c, m_Rprec_mult_avgjg.c, m_runid.c, m_set_F.c, m_set_map.c, m_set_P.c, m_set_recall.c, m_set_rel_P.c, m_success.c, m_utility.c, m_yaap.c;
- measure framework: meas_acc.c, meas_avg.c, meas_init.c, meas_print_final.c, meas_print_single.c, measures.c;
- driver and helpers: trec_eval.c, utility_pool.c, with headers trec_eval.h, trec_format.h, getopt.h, ctype.h;
- Cygwin/libgcc runtime: crt0.c, cygwin_crt0.c, premain0.c through premain3.c, _cygwin_crt0_common.cc and their system headers.]
[COFF symbol table and import data: per-object .file records matching the source list above (the m_*.c measure files, meas_*.c, measures.c, trec_eval.c, utility_pool.c, cygwin_crt0.c, premain0.c through premain3.c, dll_entry.c, dll_main.c, libgcc2.c, cygming-crtend.c), section symbols (.text, .data, .bss, .rdata, .idata$2 through .idata$7, .rsrc, .jcr, @feat.00), program symbols such as _main, _usage and _cleanup from trec_eval.c, and names of functions imported from the Cygwin DLL (_free, _strcmp, _putc, _puts, _fputs, _environ, _atof, _open, _fwrite, _strncpy, _memcpy, _memset, _munmap, _fflush, ___main, _fprintf, _lseek, _log2, _calloc, __fmode, _realloc, _exp, _log, _index, _malloc, _abort, __ZdlPv).]
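Taken together, the recoverable strings above identify this artifact as the stock trec_eval C program cross-compiled for 32-bit Windows (i686-pc-cygwin, GCC 5.3.0, Cygwin 2.4.1) and shipped inside the jtreceval jar as a resource. The following is only a minimal sketch of how such a bundled binary could be extracted and executed from Java; the resource name /trec_eval-win-x86 and the input file names qrels.txt and run.txt are assumptions for illustration, not the jtreceval API.

    import java.io.InputStream;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.StandardCopyOption;
    import java.util.List;

    public class RunBundledTrecEval {
        public static void main(String[] args) throws Exception {
            // Copy the bundled Windows binary out of the jar to a temporary .exe file.
            // "/trec_eval-win-x86" is a hypothetical classpath resource name.
            Path exe = Files.createTempFile("trec_eval", ".exe");
            try (InputStream in = RunBundledTrecEval.class.getResourceAsStream("/trec_eval-win-x86")) {
                Files.copy(in, exe, StandardCopyOption.REPLACE_EXISTING);
            }
            // Standard trec_eval command line: options, then the qrels file, then the run file.
            // qrels.txt and run.txt are placeholder file names; this only runs on Windows.
            Process p = new ProcessBuilder(List.of(exe.toString(), "-m", "map", "qrels.txt", "run.txt"))
                    .redirectErrorStream(true)
                    .start();
            try (InputStream out = p.getInputStream()) {
                System.out.write(out.readAllBytes());
                System.out.flush();
            }
            System.exit(p.waitFor());
        }
    }

In practice jtreceval performs this extract-and-exec step itself; the sketch only illustrates why a Windows trec_eval executable is packaged as a Maven resource at all.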