Southern Hemisphere mid-to-upper tropospheric planetary wave activity is characterized by the superposition of two zonally oriented, quasi-stationary waveforms: zonal wavenumber one (ZW1) and zonal wavenumber three (ZW3). Previous studies have tended to consider these waveforms in isolation, and, with the exception of studies relating to sea ice, little is known about their impact on regional climate variability. We take a novel approach to quantifying the combined influence of ZW1 and ZW3, using the strength of the hemispheric meridional flow as a proxy for zonal wave activity. Our methodology adapts the wave envelope construct routinely used in the identification of synoptic-scale Rossby wave packets and improves on existing approaches by allowing for variations in both wave phase and amplitude. While ZW1 and ZW3 are both prominent features of the climatological circulation, the defining feature of highly meridional hemispheric states is an enhancement of the ZW3 component. Composites of the mean surface conditions during these highly meridional, ZW3-like anomalous states (i.e. months of strong planetary wave activity) reveal large sea ice anomalies over the Amundsen and Bellingshausen Seas during autumn and along much of the East Antarctic coastline throughout the year. Large precipitation anomalies in regions of significant topography (e.g. New Zealand, Patagonia, coastal Antarctica) and anomalously warm temperatures over much of the Antarctic continent were also associated with strong planetary wave activity. The latter has potentially important implications for the interpretation of recent warming over West Antarctica and the Antarctic Peninsula.
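The wave envelope referred to above is commonly obtained from the Hilbert transform of the meridional wind around a latitude circle. The following is a minimal sketch of that calculation, assuming a regular longitude grid and a toy ZW1-plus-ZW3 wind field; it is not the authors' code, and all variable names and values are illustrative only.

```python
# Minimal sketch (not the authors' code) of the wave envelope construct:
# the envelope of the meridional wind v along a latitude circle is the
# magnitude of the analytic signal, |v + i*H(v)|, which measures wave
# amplitude irrespective of phase. The grid and wind field are made up.
import numpy as np
from scipy.signal import hilbert

lon = np.arange(0, 360, 2.5)          # hypothetical longitude grid (degrees)
k = 2 * np.pi / 360.0                 # radians per degree of longitude
v = 5.0 * np.cos(3 * k * lon) + 2.0 * np.cos(1 * k * lon + 1.0)  # toy ZW3 + ZW1 wind (m/s)

envelope = np.abs(hilbert(v))         # wave envelope along the latitude circle

# The mean (or maximum) envelope could then serve as a simple wave-activity index.
print(envelope.mean(), envelope.max())
```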
INTRODUCTION As part of the Australian Geophysical Observing System (AGOS), the University of Melbourne has established a seismic monitoring network in Victoria's South Gippsland region in order to better understand natural earthquake occurrence in one of Australia's most seismically active regions. To the end of 2014, the network comprised 8 surface seismic stations and 4 borehole instruments (Figure 1). In addition, the AuScope Seismometers in Schools (AuSIS) program has stations at Sale and Rosebud. The first deployment of surface instruments occurred in the month prior to the M 5.2 Moe earthquake (2012-06-19 10:56). Following the main shock, three additional surface stations (Fish Creek, Willow Grove, Loch) were deployed in 2012, and a station at Somers was deployed in 2014. There are currently 8 surface instruments deployed in Gippsland. In October 2014, 3 borehole instruments were deployed along the Strzelecki hills at depths of ~130 m. These instruments increase the spatial resolution of the Gippsland array and offer a chance to monitor microseismicity that would otherwise go undetected or be poorly resolved. In this note we highlight some of the features resolved by having this subsurface capability. An additional instrument (A346 on the map), installed by a mining company at a depth of 1.3 km, was integrated into our network, with its data also telemetered in real time.
SUMMARY This report summarises the development of a new Probabilistic Seismic Hazard Analysis (PSHA) for Victoria, called the Victorian Earthquake Hazard Map (VEHM). PSHA forecasts the strength of ground shaking expected within a given time period (return period). The primary inputs are historical seismicity catalogues, paleoseismic (active fault) data, and ground-motion prediction equations. A key component in the development of the Victorian Earthquake Hazard Map was the integration of new geophysical data, derived from deployments of Australian Geophysical Observing System seismometers in Victoria, with a variety of publicly available datasets including seismicity catalogues, geophysical imagery and geological mapping. This has resulted in a new dataset that constrains the models presented in the VEHM and is also provided as a stand-alone resource for both reference and future analysis. The VEHM provides a Victorian-focussed earthquake hazard estimation tool that offers an alternative to the nationally focussed 2012 Australian Earthquake Hazard Map. The major difference between the two maps is the inclusion of active fault locations and slip estimates in the VEHM. There is also a significant difference in hazard estimates between the two maps (even without including fault-related seismicity), due primarily to differences in seismicity analysis; these issues are described in the discussion section of this report and result in a higher-fidelity result in the VEHM. Together, these differences make the VEHM a more conservative hazard model. The VEHM currently exists as a series of online resources intended to assist those working in engineering, planning and disaster management. It is a dynamic dataset: the inputs will continue to be refined as new constraints are included and the map is made compatible with the Global Earthquake Model (GEM) software, due for release in late 2014. The VEHM was funded through the Natural Disaster Resilience Grants Scheme (NDRGS). The NDRGS is a grant program funded by the Commonwealth Attorney-General’s Department under the National Partnership Agreement on Natural Disaster Resilience signed by the Prime Minister and the Premier. The purpose of the National Partnership Agreement is to contribute towards implementation of the National Strategy for Disaster Resilience, supporting projects leading to the following outcomes: 1. reduced risk from the impact of disasters, and 2. appropriate emergency management, including volunteer capability and capacity, consistent with the State’s risk profile.
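For readers unfamiliar with PSHA, the core calculation combines the annual rates of earthquake scenarios with a ground-motion prediction equation to obtain an annual rate of exceeding any given shaking level, whose inverse is the return period. The sketch below is a heavily simplified, generic illustration of that step; the scenario rates and ground-motion coefficients are invented for illustration and are not taken from the VEHM.

```python
# Generic, heavily simplified PSHA hazard-curve sketch (not the VEHM model).
# Each scenario has an annual rate; a lognormal ground-motion model gives the
# probability that the scenario exceeds a given PGA; summing rate * probability
# over scenarios gives the annual exceedance rate, whose inverse is the return
# period for that shaking level. All numbers below are invented.
import numpy as np
from scipy.stats import norm

# Hypothetical scenarios: (annual rate, magnitude, distance in km)
scenarios = [(0.05, 5.0, 20.0), (0.01, 6.0, 40.0), (0.002, 7.0, 60.0)]

def median_pga(mag, dist_km):
    """Toy ground-motion prediction equation (PGA in g); coefficients invented."""
    return 10 ** (-1.5 + 0.4 * mag - 1.2 * np.log10(dist_km + 10.0))

SIGMA_LOG10 = 0.3  # assumed aleatory variability of log10(PGA)

def annual_exceedance_rate(pga_level):
    """Sum rate * P(PGA > level | scenario) over all scenarios."""
    rate = 0.0
    for nu, mag, dist in scenarios:
        z = (np.log10(pga_level) - np.log10(median_pga(mag, dist))) / SIGMA_LOG10
        rate += nu * norm.sf(z)  # survival function = exceedance probability
    return rate

for level in (0.05, 0.1, 0.2):  # PGA levels in g
    lam = annual_exceedance_rate(level)
    print(f"PGA > {level:.2f} g: rate {lam:.4f}/yr, return period {1.0/lam:.0f} yr")
```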
BEYOND THE IMPACT FACTOR. Having become quite versed in the art of research 8 years post-PhD, I have been very fortunate to witness a renaissance in publishing in two ways. First, I remember quite well during my PhD training (over 10 years ago) the process of preparing a manuscript for the highest-ranked journal, submit, reject, reformat and submit to the next journal, reject, submit.....you get the story. Published manuscripts were usually in print form. The second way utilises the Internet and open access publishing. During that time, the impact factor was the key metric by which a publishing house was measured (The agony and ecstasy of the impact factor). This evolved quickly into a measure of a researcher’s performance. There was an active push to get manuscripts into journals with a high impact factor, as it reflected positively on the authors involved. Over the course of my career, the merit of using the impact factor to judge a study was questioned. These days, it is almost a profanity to even consider the Impact Factor as a measure of a manuscript’s value (Randy Schekman’s piece on Nature, Cell and Science, and Occam’s Typewriter on the Impact Factor). Believe it or not, I have heard that a swear jar was enforced at a grant review panel, where whoever mentioned “Impact Factor” would have to contribute to the jar. Okay, so the Impact Factor is not relevant; so what is a good metric? What is a good measure of a published manuscript for the likes of major granting bodies and prospective employers to judge your performance by? Speaking to many in the field and to many of my mentors, I would say the jury is still out; there really isn’t a good metric to go by, but nonetheless there should be one. In preparing this little piece I came across Eugene Garfield; it seems he is the pioneer of citation metrics and has written many commentaries on the history of the Impact Factor and on how it has been misused. There have been discussions of other metrics, so-called altmetrics, where page hits or PDF downloads, for instance, are being considered as measures of research impact. Back to my inaugural first-author manuscript: I aimed my submissions at the major journals, including _Science_, _Nature_, _Cell_, _Nature Structural Biology_ and _Developmental Cell_, each time sacrificing some impact factor points (see Figure [fig:tour]). This was at my PhD supervisor’s guidance, as we both thought the body of research to be paradigm-shifting at the time. I quickly won the boomerang award for the manuscript that came back the most times within the group (not exactly a prize worth singing a song about!). In the end, we settled on what was at the time a very new publishing house, the Public Library of Science (PLoS). “What is this PLoS?” my supervisor said with scorn. “There is no impact factor, and what is this open access business? No business if you ask me; how will they survive if they give everyone open access to their published papers?” I was sold on how PLoS came about, with regard to its push for online access and publishing at the time. It took a little convincing (meeting the editor of PLoS at the time) for my supervisor to come around, settling on an analogy with the stock market: “well, the impact factor can only go up with this one”. To make a long story short, my paper was published in _PLoS Genetics_. It was an interesting finding to me and my supervisor, but wasn’t really to the editors of _Nature_ and _Science_.
Today, there are many open access publishers, such as open access pioneer BioMed Central (BMC), which publishes over 260 journals spanning all topics of biology and medicine and paved the way for other publishers, such as PLoS. As a profession, we have all come to realise the benefits of open access. Even the more unscrupulous entities that prey on unsuspecting scientists have embraced open access (Beall’s List of Predatory Open Access Journals). Even the likes of _Nature_ and _Science_ have embraced open access. It’s great, and it makes research accessible much more quickly than conventional print-based avenues. Measures of _impact_ have evolved in the era of open access; however, they are only slowly being adopted by the important funding bodies. In fact, it is a requirement of the NIH in the USA, and more recently the NHMRC and ARC in Australia, that government-funded research be published as open access.
Imagine an event generating multiple innovations in medical research in one weekend. Imagine an event giving spreadsheet-limited laboratory researchers a taste of what computers can _really_ do. Imagine an event where the powers of coding and analysis are combined with the powers of biology, genetics and medicine to convert dry or unobtainable data into ground-breaking medical insight. And imagine it all done in a spirit of frenzied teamwork and friendly competition. This is HealthHack, a 48-hour hackfest in which medical researchers nucleate teams of brilliant, creative people to solve the data-processing problems faced by real researchers and real clinicians. The weekend opens on Friday evening with the problem owners pitching their ideas, problem and proposed solution, to the assembled hackers, followed by an hour in which hackers can approach problem owners with questions and seek additional details. Hackers then choose which problem they are going to work on, and the assembled teams plan their assault and commence work on their application. Apart from going home (presumably) to sleep, the event runs all weekend. Sunday 4pm is _down tools_, and the teams then present their prototypes to everyone, with prizes for first place, second place and _spirit of HealthHack_. Below is a sampling of the teams and their creations:
We study a generalised model of self-avoiding trails, containing two different types of interaction (nearest-neighbour contacts and multiply visited sites), using computer simulations. This model contains various previously studied models as special cases. We find that the strong collapse transition induced by multiply visited sites is a singular point in the phase diagram and corresponds to a higher-order multicritical point separating a line of weak second-order transitions from a line of first-order transitions.
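As a concrete illustration of the two interaction types, the following sketch computes the Boltzmann weight of a square-lattice trail from its counts of multiply visited sites and nearest-neighbour contacts. The contact definition used here (visited sites adjacent on the lattice but not joined by a step of the trail) is an assumed convention, and the sketch does not reproduce the simulation algorithm used in the study.

```python
# Toy illustration (not the authors' code) of the two interaction types in a
# generalised interacting self-avoiding trail on the square lattice: sites
# visited more than once, and nearest-neighbour "contacts" between visited
# sites not joined by a step of the trail. Weights and conventions are assumed.
from collections import Counter

def trail_weight(trail, omega_contact, omega_multi):
    """trail: list of (x, y) lattice sites along the path (no bond is repeated)."""
    visits = Counter(trail)
    multiply_visited = sum(1 for site, n in visits.items() if n > 1)

    sites = set(trail)
    steps = {frozenset(pair) for pair in zip(trail, trail[1:])}  # traversed bonds
    contacts = 0
    for (x, y) in sites:
        for nbr in ((x + 1, y), (x, y + 1)):  # count each unordered pair once
            if nbr in sites and frozenset(((x, y), nbr)) not in steps:
                contacts += 1

    return (omega_contact ** contacts) * (omega_multi ** multiply_visited)

# Example: a short trail with one multiply visited site and one contact
# (interaction weights invented); prints 1.5 * 3.0 = 4.5.
trail = [(0, 0), (1, 0), (1, 1), (1, 2), (0, 2), (0, 1), (1, 1), (2, 1)]
print(trail_weight(trail, omega_contact=1.5, omega_multi=3.0))
```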
DATE Friday September 19, 2014
VENUE We are in talks to secure space within the Unimelb/VLSCI Science Gallery. As Plan B, we have secured space for the talks and workshops on this date:
FOR TALKS IN THE AM 207 Bouverie Street - Theatre 1
FOR WORKSHOPS IN THE PM VLSCI space: 207 Bouverie Street - B118 and 207 Bouverie Street - B113
TARGET PARTICIPANTS All science practitioners in open data and open publishing (and those participating in ISBC2014)