Inferring Phylogenies

Jen here – 

Are you interested in understanding how we take morphological data from extinct animals and use them to infer an evolutionary history? We often think of and visualize relationships as trees; your family tree is one example. We have an entire page on Reading the Tree of Life so you can understand how to read and interpret these visualizations. These trees, called phylogenies, can be used as a framework to test different macroevolutionary questions regarding species distribution, paleoecology, rates of change, and so much more! We hope to set the stage to explain how each step is done! 

Before really diving into anything specific, I would suggest you think a little about evolution, phylogeny, and all the basic terminology that builds the foundation for understanding evolutionary theory. I would recommend that you work through The Compleat Cladist: A Primer of Phylogenetic Procedures. This is effectively a workbook that walks you through terms, concepts, and more!

This isn’t meant to be an exhaustive guide but rather to set you up to explore the programs and infer a phylogeny! Now that you have learned all you can about your study organism and how to build a character matrix, the next step is inferring a phylogeny. 

What does it mean to infer a phylogeny?

Simply put, evolutionary scientists can take a data matrix and apply mathematical and statistical models to estimate, or infer, species relationships and generate a phylogeny (evolutionary history). In paleontology, the data are generated by an individual’s understanding of homologous characters in the group and are inherently shaped by that expert knowledge. Homology is similarity due to inheritance from a common ancestor. As such, the researcher is presenting a phylogenetic hypothesis for the group.

It is important to understand the purpose of pursuing any scientific approach. Why paleontologists should pursue building and inferring phylogenies is well described by Brian O’Meara in his PhyloMeth video on Why build phylogenies? In essence, tree topologies not only tell us how organisms are related to one another but can also be used as a framework for a variety of macroevolutionary approaches. 

To get an idea of the basics of tree space, please watch this video, Be afraid of tree space, by Brian O’Meara to get you excited about trees.

What are the methods?

These are several of the primary methods currently being used in phylogenetic paleobiology. There are certainly more methods and we encourage you to explore and learn on your own!

Maximum Parsimony

Parsimony, similar to Occam’s razor, suggests that the simplest explanation that fits the evidence is the best. Applying this logic to evolutionary trees means that the best inference or hypothesis is the one that requires the fewest evolutionary changes – or character changes across branches. 
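To make "fewest changes" concrete, here is a minimal sketch (not from the post) of Fitch's algorithm, the standard way to count the minimum number of state changes a single character requires on a fixed tree. The taxa, tree shape, and character scores below are invented for illustration.

```python
# A toy Fitch small-parsimony count for one character on a fixed binary tree.
# Tips are strings; internal nodes are (left, right) tuples.

def fitch_count(node, states):
    """Return (possible_state_set, min_changes) for the subtree at `node`."""
    if isinstance(node, str):                  # tip: its state set is just what we observed
        return {states[node]}, 0
    left_set, left_cost = fitch_count(node[0], states)
    right_set, right_cost = fitch_count(node[1], states)
    overlap = left_set & right_set
    if overlap:                                # children can agree: no change needed here
        return overlap, left_cost + right_cost
    return left_set | right_set, left_cost + right_cost + 1   # conflict: count one change

# Tree ((A,B),(C,D)) with one binary character (0 = absent, 1 = present)
tree = (("A", "B"), ("C", "D"))
observed = {"A": 0, "B": 0, "C": 1, "D": 1}
_, changes = fitch_count(tree, observed)
print(changes)  # 1: this topology explains the character with a single change
```

A parsimony search repeats this count over every character and many topologies, keeping the tree(s) with the lowest total; on the alternative topology ((A,C),(B,D)) this same character costs two changes instead of one.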

Software: PAUP*, TNT

More reading:

Maximum Likelihood

Likelihood methods compute the probability of the data given a tree and a model of evolution. The more probable the data are given the tree, the more that tree is preferred. Because the model is chosen by the user, this method can be employed in a variety of situations. 

Commonly used models of evolution include Jukes-Cantor (JC) and Felsenstein (F81), but there are many others. Here is an entire chapter on Selecting Models of Evolution by David Posada.
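As a small illustration of what such a model actually provides, here is a sketch of the Jukes-Cantor (JC69) transition probabilities: the chance that a site ends in the same state, or a different one, after a branch of a given length. The branch length used below is an arbitrary example value.

```python
import math

# Jukes-Cantor (JC69) transition probabilities for a branch of length d,
# where d is the expected number of substitutions per site.

def jc69_prob(d, same):
    decay = math.exp(-4.0 * d / 3.0)
    if same:
        return 0.25 + 0.75 * decay   # probability of observing the same state
    return 0.25 - 0.25 * decay       # probability of each of the 3 other states

# Sanity check: over all four possible end states the probabilities sum to 1.
d = 0.1
total = jc69_prob(d, same=True) + 3 * jc69_prob(d, same=False)
print(round(total, 10))  # 1.0
```

A likelihood program multiplies probabilities like these across every branch and site of a candidate tree, then prefers the tree that makes the observed data most probable.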

Software: PAUP*, RAxML; the Bayesian software listed below can be used as well. 

More reading: 

Bayesian Estimation

Similar to maximum likelihood, Bayesian estimation is based on the probability of the data given a model of evolution, with the addition of prior beliefs; the result is a posterior distribution of trees rather than a single best tree.
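The core idea can be sketched in a few lines: each tree's posterior probability is proportional to its likelihood times its prior. The two trees and all numbers below are toy values for illustration; real programs like MrBayes or RevBayes sample huge tree spaces with MCMC rather than enumerating them.

```python
# Toy Bayesian update over two candidate trees (all values invented).

likelihood = {"tree1": 0.008, "tree2": 0.002}   # P(data | tree), from a model of evolution
prior      = {"tree1": 0.5,   "tree2": 0.5}     # prior belief before seeing the data

# posterior ∝ likelihood × prior, then normalize so probabilities sum to 1
unnorm = {t: likelihood[t] * prior[t] for t in likelihood}
total = sum(unnorm.values())
posterior = {t: p / total for t, p in unnorm.items()}

print(round(posterior["tree1"], 3))  # 0.8
```

Changing the prior (say, downweighting tree1 based on outside evidence) shifts the posterior, which is exactly the "prior beliefs" ingredient that separates Bayesian estimation from plain maximum likelihood.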

Software: RevBayes, MrBayes, BEAST

More reading: 

How do you select a method?

Why not try them all? Paleontology has been slow to adapt statistical models to better suit our character data, and many mindsets are stuck on ‘this is the best way’. However, until you try each method it is hard to say one is ‘better’ than another. Some methods may provide a route that more closely aligns with how your clade evolved through time. Maybe one is more flexible for your dataset, maybe you get the same answer with multiple methods, or maybe you learn something new about your dataset from running multiple scenarios. 

More reading on support and comparing methods: 

General resources and further reading: 

Subscribe to the PhyloMeth YouTube channel and watch previous lectures and discussions on different aspects of phylogenetic methods.

Paris Agreement 101

Shaina here –

On February 19, 2021 the United States officially rejoined the Paris Agreement. This is an important shift in US climate policy so let’s go over what it means and what the Paris Agreement is! 

What is the Paris Agreement?

It is an international agreement to address climate change under the auspices of the United Nations Framework Convention on Climate Change (UNFCCC). The stated goal is to keep the rise in global mean surface temperature to below 2℃ and ideally below 1.5℃. The agreement was adopted in 2015 at the 21st Conference of the Parties (COP) to the UNFCCC and agreed to by 196 countries.

What is the history of the Paris Agreement?

The formal history within the UN began in 1992 with the creation of the United Nations Framework Convention on Climate Change. The UNFCCC established the vague goal of reducing greenhouse gas emissions to prevent ‘dangerous anthropogenic interference’ (DAI) with the climate. Over the years there were many efforts that took place under the UNFCCC to achieve this, such as the 1997 Kyoto Protocol which called for binding emissions reductions for certain countries over a short time period. One of the main issues with trying to avoid DAI is that what defines danger has different meanings for different people in different places. This meant that finding a goal that diplomatic representatives from all involved countries could agree on was rather challenging. A long and meandering path led to the decision to adopt the 2℃ (and hopefully 1.5℃) temperature target, and eventually to the Paris Agreement.

The US involvement in the process that led to the Paris Agreement is very complex. As the world’s largest historic greenhouse gas emitter, the US had a lot of power during negotiations. Any international action aimed at addressing climate change must have the involvement of large emitters in order to be successful. However, large emitters became that way through reliance on fossil fuels— and relatedly slavery and colonialism— and thus have an interest in seeing their use as an energy source continue, despite the urgent need for production to decrease. US negotiators worked to ensure that, rather than binding emissions reductions, the agreement had self-defined commitments, and that it avoided requiring things like liability for loss and damage resulting from climate disasters.

How does it work?

The Paris Agreement does not require binding emissions reductions, meaning that countries are not actually required to reduce emissions by a certain amount at a certain time, nor are they required to tie their plans to address climate change to their historic emissions. Rather, countries are only bound to participate in the process outlined in the agreement. That process consists of several steps. First, countries each come up with their own individual plans, called Nationally Determined Contributions (NDCs), for how they want to address climate change. These plans can be a combination of mitigation, adaptation, finance, and technology transfer. Then every 5 years they reassess and hopefully ramp up their action plans. Ideally each iteration brings them closer to net-zero emissions by mid-century (the term net here gives a ton of wiggle room for things like market mechanisms that may or may not actually lead to emissions reductions).

How is it working out?

To be honest, rather poorly so far. It has been five years since the Paris Agreement was ratified and during that time emissions, greenhouse gas concentrations in the atmosphere, and temperatures have continued to rise. While there was a slight decline in emissions in 2020 due to the COVID-19 pandemic (Le Quéré et al 2020), that decline was not a result of countries taking action on climate change, but rather of the emergency lockdowns. The pledges countries have so far submitted would put us on track for around 3°C of warming by the end of the century. The annual COP meetings are where negotiations for Paris Agreement implementation happen, however the COP meeting that was supposed to take place at the end of 2020 was cancelled (youth held their own in its place). Countries were still required to submit updated NDCs by the end of 2020 and then negotiations will continue at COP26 in November.

What does the Paris Agreement say about climate justice?

To be honest with you, dear reader, this part irritates me. There is only one mention of climate justice in the Paris Agreement and it reads: “noting the importance for some of the concept of “climate justice”, when taking action to address climate change”. Climate justice is a term used to encapsulate the many ways that a changing climate is related to sociopolitical inequality across many scales- this can include the ways climate impacts disproportionately impact marginalized populations, the ways historic emitters have had an outsized contribution to creating the problem, and much more. In my opinion, and I am sure many of you would agree, justice is one of the most fundamental, if not the most fundamental, issue at play in the climate crisis. But it is only mentioned in passing here and as only being important “to some”. Many scholars have addressed shortcomings with the Agreement with respect to climate justice (I wrote a chapter of my own dissertation that will add to this body of knowledge), however despite its shortcomings and lack of robust consideration of justice the Agreement is currently the best hope we have for a coordinated international response. And we desperately need that. So this is where the general public can play a large role- we can advocate for policies in our countries and communities that will center justice as a way of bringing this concept to the forefront of the conversation.

What happens after the US rejoins?

The Biden administration will need to submit a new NDC with a renewed pledge. The pledge that was submitted under the Obama administration was considered ‘insufficient’. Then the Trump administration withdrew from the Paris Agreement (moving us into ‘critically insufficient’ territory) and worked to undermine climate action at every opportunity with numerous environmental policy rollbacks, deregulations, and anti-science rhetoric. So Biden will need to submit something truly ambitious, and much stronger than what was done under the Obama administration. It will be important that they not only make an ambitious plan but that they show immediate progress towards justice-centered emissions reductions. Their NDC will likely be based around Biden’s climate plan, which does look ambitious, and what they submit to the UNFCCC will need to be compatible with giving us the best possible chance of staying below 1.5℃ of warming in order to show that they are fully committed to justice and climate action. 

Rejoining the Paris Agreement is a necessary step for the US to get back on track with the international effort to address climate change. However we will need to watch closely over the next few months to see what the submitted NDC looks like and what concrete steps are being taken immediately to put those plans into action. 

For now, let’s celebrate this win and do all we can to ensure that this is successful!

References:
Le Quéré, C., Jackson, R.B., Jones, M.W. et al. Temporary reduction in daily global CO2 emissions during the COVID-19 forced confinement. Nat. Clim. Chang. 10, 647–653 (2020). https://doi.org/10.1038/s41558-020-0797-x

The Scientific Process: What is “Peer-Review”?

Kristina here –

Today, humans have access to more information than at any other time in human history, all at the tips of our fingers with a quick Google search, or a “Hey, [insert AI name here]”. While equal access to the internet and information technology is beyond the scope of this post, cell phones, tablets, and laptops, have made it easier than ever to quickly look up information. Yet with this technology has come a huge surge in wide-spread misinformation, making it difficult to know whether you can trust the information you find. Pretty much anyone can post whatever they want, and pass it off as “fact”. How then can the average person determine whether what they’re reading is actually credible and factual? Furthermore, if you see something that says “scientists disagree on X topic”, who should you believe? Contrary to what you might think, not all viewpoints are created equal, and both scientists and the average person can be guilty of confusing “opinion” and “fact”. This is where the “peer-review” process comes in to help.

So what is “peer-reviewing”?

Most people hearing “peer-review” assume it is a good thing (and this is certainly true) but what does “peer-review” mean? Essentially, peer-review is an integral part of the scientific process, and what helps separate “opinion” and “fact”. It is what scientists use to make sure that their research is as thorough, accurate, and factual as possible. In general, scientists do not consider something trustworthy or credible unless it has gone through some kind of peer-review process.

How does peer-review work?

A scientist or group of scientists will first go about conducting research. They will ideally do background reading to make sure they understand what is already known about the topic, and where there might be gaps in our knowledge. They will then design an experiment or test, collect data, and analyse those data. The ultimate goal of science is to try to refute a null hypothesis (e.g., all apples are red). We must show beyond a reasonable doubt that something differs from the null, i.e., what has been previously determined (e.g., some apples are green). If we can’t show otherwise, and the more scientists run their own tests and come to the same conclusion, the stronger our hypothesis becomes, or the closer it is to “the truth” (e.g., apples can be different colours).

Once scientists are finished collecting and analysing the data, and have come to a conclusion (e.g., refuted, or failed to refute a hypothesis), they will write a paper reporting their findings. See Sarah’s post on how to write a scientific paper here. The authors then submit the paper to a peer-reviewed journal, usually one that has been selected based on the topic or audience of the journal. The submitted paper is sent to an editor at that journal, who then decides if the paper is appropriate for their journal. If the paper “passes” this first test, the editor will then send it out to at least two experts in that topic.

How are the peer-reviewers selected?

Usually, journals request that authors include anywhere from 2 – 10 names of experts that know enough about the topic to provide sufficiently thorough critiques of the paper. Authors cannot include close colleagues or collaborators, as this could create bias (e.g., your friend is more likely to give you a pass, even if you don’t deserve it). Editors can opt to choose as many or as few people as they want from the authors’ list. Ideally, editors will also find at least one person not on the authors’ list that is an expert on the topic. Authors may also include a list of people they don’t want to review their papers, but they must have a good reason (e.g., “this person doesn’t agree with me” is not an acceptable reason as critical reviews are important to ensure scientific rigor. But, “this person has been openly hostile towards me” would be – some people can be jerks and block good science in peer-review). If you have too many people that you don’t want to review your paper, that sends up red flags to editors, however, so including people on a “no-review” list shouldn’t be taken lightly by authors, and should only be done when absolutely necessary.

The editor then sends the paper to at least two of these reviewers. If the reviewers accept, they then have about 2 – 4 weeks to evaluate the paper. It’s important to note that editors do not usually review the papers themselves (unless they happen to be an expert in that topic), because, especially for larger journals, the editor is unlikely to know enough about the topic to give sufficiently thorough feedback (e.g., a vertebrate palaeontologist won’t review an invertebrate palaeontology paper, and vice versa). 

Peer-reviewing a scientific paper

If you are the reviewer, your job is to go through the paper and evaluate the science independently. Your comments should stick to the science and presentation of the science, and you must refrain from unnecessary criticism of the authors. For example, “this is a poorly written paragraph” is not helpful or appropriate. Instead, you should point out where you didn’t understand what was written, and why. Reviewers typically read the paper over several times to make sure they understand what the authors are trying to test, then evaluate whether the experimental design, methods, and analyses of the data were sufficient to test the hypothesis. Often, reviewers will analyse the data themselves to make sure they find the same things as the authors. Sometimes, if the reviewer feels that the methods or analyses were insufficient, they will suggest that the authors try other analyses that will more accurately test their hypothesis. This is one of the most common types of reviewer feedback. 

If the methods and analyses all hold up to scrutiny, the reviewer will then make sure that the interpretation of the data (included in a paper’s “Discussion” and “Conclusions” sections) matches the results of the analyses. Another common type of feedback from reviewers occurs when authors overstate (or sometimes understate) their conclusions (e.g., the authors may claim their paper proves x, but their results might only be applicable under very specific circumstances). A good reviewer will make sure that all of the claims made by the authors are supported by the tests they perform, and should watch for speculation (speculation may be acceptable, so long as it is clearly stated that it is such).

Reviewers then provide a thorough report back to the editor, including specific comments/suggested edits from throughout the paper. Reviewers will provide a recommendation to the editor indicating whether they think the paper is in need of revisions (“major” or “minor” revisions), or if the paper should be rejected or accepted. Major and minor revisions are the most common reviewer recommendations – major often means further analyses are needed before the hypotheses have been sufficiently tested, minor usually means that the methods and results are sound, but the authors need to tweak a few paragraphs, interpretations, or graphs throughout the paper. Papers that are considered “accepted” are exceptionally well done, and the reviewer may only have small comments that need to be addressed, or possibly none. Papers that reviewers “reject” usually have insufficient evidence to accurately test the hypotheses proposed, may have critically flawed methods or analyses, or would require very extensive revisions that would take a long time to complete, or would end up testing a different hypothesis. Rejections do not always mean that the authors should abandon the paper – it could just mean that there is more work to do before the paper can be fully evaluated. Some journals even have a “reject and resubmit” option, which means that the paper is rejected for now, but that the authors are welcome to resubmit in the future if they are able to address the reviewer’s concerns. It is sort of like “very major revisions” and gives the authors a bit more time/flexibility to complete the revisions.

Revisions

Once the editor has received reviews back from all of the reviewers, they will go through all of them to see if the reviewers have picked out common flaws in the paper, and to make sure the reviews were sufficient. If the reviewers clearly disagreed on something, the editor will often send out the paper to at least one other reviewer for another opinion (this is helpful if a reviewer was unnecessarily harsh or lax). Based on all of the reviewer evaluations, the editor will provide the final recommendation for the paper (accept, reject, reject and resubmit, major/minor revisions). The editor then sends their recommendation and summary of the reviews, along with all of the reviewer comments, back to the authors. 

The authors must then revise the paper based on the reviewer feedback, and address every single comment made by the reviewers. It is the job of the authors to not be defensive about the comments (which can be hard when someone is criticizing your work), but it is important to remember that the reviewer’s job is to make your science better. Depending on the amount of revisions requested (major or minor), the authors are usually given at least 2 weeks (and sometimes several months) to provide their revisions, as addressing every single comment thoroughly takes time. The authors then resubmit their revised paper, as well as a list of their responses to all of the reviewer comments and the actions taken to address each comment. The editor uses both documents to determine if the authors have done due diligence with the reviewer’s feedback, or if further revisions are needed.

If necessary, the editor will send the paper back to the original reviewers or to new reviewers. The process will repeat until the paper becomes acceptable to reviewers, or the paper is rejected. Once the paper is considered acceptable by the reviewers and editor, the peer-review process is complete and the paper is ready to be formatted and published in the journal! It can take anywhere from weeks to years for a paper to become accepted! 

Responsibilities of reviewers

It is important to note that neither authors, nor reviewers are paid (editors at larger journals are sometimes paid positions). Instead, peer-reviewing is considered an “academic service” and authors should expect to review 1 – 2 papers for every paper they publish (i.e., for each review of your paper, you should return the favour by reviewing that many papers). While some people have strong opinions on monetary compensation for reviewers and editors, the current justification is that reviewing is a service and the lack of compensation should keep reviewers impartial. The peer-review process is a lot of work for everyone involved, but is the best way to ensure we have a system that produces sound, thorough, and accurate science.

Peer-review doesn’t just happen in journals, either. Scientific books, text books, theses, and government reports may also be considered peer-reviewed, as they are usually thoroughly reviewed by several experts, or scientific review panels. But the most common form, and the most acceptable form of citations or sources, are peer-reviewed journal articles. Peer-review also occurs for articles in other fields in academia, such as history and the arts.

So, how do you tell if what you’re reading (or what you’ve heard) is credible?

Has it been peer-reviewed?

Is the information coming from a reputable peer-reviewed journal?

Do they cite their sources when stating information/presenting facts?

Even if the information isn’t presented in a journal (e.g., a governmental report, book, or blog post), do they use citations to support their arguments? Are these sources credible (i.e., from peer-reviewed sources, not some random internet link)?

Do the majority of scientists/experts in the topic agree with this opinion?

If you come across a “fact” that a scientist has stated, remember that not all “opinions” are created equal. If the majority of experts have come to a conclusion, yet one person disagrees, that person has most likely failed to properly refute a hypothesis (their conclusions do not match the majority of the evidence). This usually happens when a scientist fails to include all of the appropriate variables in their methods, meaning that the test they used to refute the hypothesis was flawed, even if their work has been published. For example, those that claim global warming has happened before and that therefore the global warming we are experiencing today is just natural variation are failing to include an important variable: the rate of change (which is much faster than any past “background” variation). 

Is the author an expert on the subject?

It takes several years to gain expertise in a topic, mostly by reading all of the peer-reviewed papers on that topic (hundreds or even thousands of papers), staying up-to-date on new research, conducting experiments, and going through the peer-review process. Google searches don’t cut it. Even if they are a scientist, if they normally work in a different topic, there is a greater chance that they might be missing something that is common knowledge to experts in that field. For example, I as a palaeontologist am not about to try and write a paper on black holes, even though I think they’re fascinating and have read lots about them. 

An imperfect system

The peer-review system is not infallible. Nowadays, scientists will often “publish” their work online outside or ahead of the peer-review process with things called pre-prints. Pre-prints allow scientists to share their work, especially large datasets, ahead of peer-review so that they can share their work more quickly and potentially get feedback from other researchers. Often, the data included in pre-prints will end up going through the peer-review process, but as the peer-review process can take a long time, pre-prints allow researchers to get their data out there and get feedback faster. While it may not seem as rigorous because it hasn’t gone through the peer-review process, it can actually end up being more transparent because it potentially allows more people to review the research. Essentially, pre-prints still go through “peer-review” in the actual sense of the word, just not necessarily through the traditional channels of journals.

Journal reviewers can also sometimes act inappropriately. For example, reviewers might make unhelpful comments that are not constructive or based on the science, or may even be downright abusive or derogatory – e.g., criticizing the author, not the work, or saying something unnecessarily rude. While these kinds of comments are not permissible in the peer-review process, and it is usually the responsibility of the editor to reject reviews that include inappropriate content, these kinds of things regularly slip through. It is then within the author’s right to ask the editor to step in and find an alternative reviewer or to ignore the comments when making their final decision. These kinds of checks and balances are what help the peer-review process to remain as impartial as possible – comments must be limited to the science and the presentation of the material, and cannot include opinions or feelings about the work, even if the work disagrees with your own views.

Finally, just because something gets published doesn’t mean it’s perfect. There are lots of bad papers out there that slip through the peer-review process. Editors and reviewers are people too. That is why scientists must always evaluate previous work for themselves. It is an inherent part of the scientific process – trying to independently reject that null hypothesis to see if you come to the same conclusion.

Building a Character Matrix

Jen here – 

Interested in understanding how we take morphological data from extinct animals and use them to infer an evolutionary history? The resulting trees, called phylogenies, can be used as a framework to test different macroevolutionary questions regarding species distribution, paleoecology, rates of change, and so much more! We hope to set the stage to explain how each step is done! First things first: constructing a character matrix. 

Before really diving into anything specific, I would suggest you think a little about evolution, phylogeny, and all the basic terminology that goes into this field. I would recommend that you work through The Compleat Cladist: A Primer of Phylogenetic Procedures. This is effectively a workbook that walks you through terms, concepts, and more!

This isn’t meant to be an exhaustive guide but rather to set you up to explore the program and generate a test character matrix!

Step 1: Learn about your study group

This will involve a LOT of reading and diving into the history of the animals you are interested in. In some instances this is easy; in others it is very difficult! I won’t dwell on this too much, but it can be hard to know where to begin. I would start by using Google Scholar to search for your group of interest plus terms like evolution, morphology, and phylogeny. Then you will probably have to head to the library armed with a list of literature that is much older than you to really begin your deep dive. Remember that ideas change through time, so starting at the beginning is really valuable to learn how ideas have changed!

What is important is that you also learn about homology and work to understand the homologous elements of your critters. Homology is simply similarity due to inheritance from a common ancestor. The understanding and evaluation of homology may differ depending on the group you are looking at. For example, homology in echinoderms has been considered for a while now and there are several schemes: one takes into account the body as a whole and how the elements are connected, while another takes a more specific approach, looking at specific plates around the mouth. These are not mutually exclusive schemes and can be used in concert with one another. Another good thing to remember is that some people like to think they are more correct than others – who’s to say, really. Just make sure you do your own homework to form your own opinions and ideas. 

Step 2: Organize your information

There are several ways to do this, you could simply store information in Excel or Google Sheets or you could use a program designed for curating character data. I have used Mesquite for this. Mesquite is freely available software that is 

“…modular, extendible software for evolutionary biology, designed to help biologists organize and analyze comparative data about organisms. Its emphasis is on phylogenetic analysis, but some of its modules concern population genetics, while others do non-phylogenetic multivariate analysis. Because it is modular, the analyses available depend on the modules installed.”

You can easily describe your characters, add new taxa, remove taxa, import or draw a tree and see how characters change across different tree topologies. 

Here is the barebones starting place. I set up a new file and said I wanted three taxa and three characters. Now I can go in and start editing things!

 

There is a side toolbar where you can easily start to modify the matrix. So you can change the taxon names, add taxa, change characters, add characters, delete whatever you want, and a lot more that I haven’t really messed around with! I suggest that if you are a first-time user, you spend some time messing around with a fake matrix. Once you get a sizable dataset in here, it’s best you don’t make any mistakes! Figure out where you may go awry and troubleshoot ahead of time.

Here is my edited matrix where I’ve added in three taxa and three characters. Notice at the bottom where it shows a character and the different states that are available. So when you edit the matrix you can use numbers or the character state – numbers are easier!

 

An easier way to import your characters and their different states is to use the State Names Editor window. This shows you the list of your characters and all the states each can have – you can easily edit these, and it's a nice way to organize the characters, since in the character matrix the text is slanted and kind of hard to read.

Character matrix with the character list on the far left column and the states spanning the rest. The states can be whatever you want – which is where bias can slip in so don’t forget to refer back to your knowledge base and understanding of homology.

 

The functionality of Mesquite extends well beyond this. If you are looking for tutorials or want to push the limits of the program, there is plenty of further reading on the Mesquite project website.

Step 3: Export your matrix for analysis

Extensive export options via Mesquite!

File > Export will give you a series of options for exporting your file; don't forget to also regularly SAVE your file so that you can revisit your matrix and easily add to it! Most programs that infer phylogenies require a NEXUS file. This type of file contains your matrix and often a bit more information about what you want in the analysis, or about the characters. I would suggest exporting a few different file types and opening them in your favorite plain text editor so you can see how they are structured and why certain programs may want different files and different information!
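To give you a feel for the structure, here is a minimal, hand-written NEXUS file for a made-up three-taxon, three-character matrix (the taxon names and scorings are invented, and different programs expect different extra blocks, so treat this as a bare-bones sketch):

```text
#NEXUS
BEGIN DATA;
    DIMENSIONS NTAX=3 NCHAR=3;
    FORMAT DATATYPE=STANDARD MISSING=? GAP=- SYMBOLS="01";
    MATRIX
        Taxon_A  000
        Taxon_B  011
        Taxon_C  111
    ;
END;
```

Comparing an export like this against other formats (e.g., Phylip or TNT) generated from the same matrix is a nice way to see why different programs want different files.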

 

Counting Deep Sea Sediments

Adriane here–

As paleontologists and paleoceanographers, sometimes the analyses we do involve complex equations, time-consuming geochemistry, or large amounts of computational time running models. But every now and then, we gather data using a method that is simple and fast. Today, I want to talk about one such method that I use quite often in my research. These data are called biogenic counts.

In previous posts, I've written about the deep-sea sediments I use in my research, such as sampling the cores we drilled from the Tasman Sea, and processing these samples once they are back in the lab. Each sample, which is stored in a small vial and represents 2 cm of the core (or 10 cubic cm of material), contains pieces of the hard parts of plankton and animals, as well as minerals. These minerals and biogenic pieces, then, can tell us about our oceans and the life they held millions of years ago.

Biogenic count data is just that: I dump the sediment samples onto a tray and count the number of 'things' in each sample to determine the percentage of each 'thing' there. 'Things' in the sediment fall into a few different categories: benthic foraminifera (foraminifera that live on the bottom of the seafloor), planktic foraminifera (foraminifera that float in the upper part of the water column in the open ocean), echinoderm spines (the hard parts of things like starfish and sea urchins), foraminifera fragments (pieces of broken foraminifera shell), and sponge spicules (the hard parts of sponges, which look like spiked glass); I also make note of any minerals found in the sample. In one day, I do about 10 samples, which doesn't seem like much but adds up every day!

Below I'll go over the exact steps I take when performing biogenic counts:

A) An image of one of my jarred samples. B) The microsplitter used to split samples. Notice that the sample being poured in is split between the two cups on either side.

First, I take the jarred sediment and split the sample using a micro-splitter. A micro-splitter is a tiny contraption that equally ‘splits’ the sediment into two holders. Because each sample contains tens, maybe even hundreds of thousands of particles, there’s no way we could count all of that! So instead, splitting the sample down to a reasonable number of particles allows us to more accurately and quickly count the number of particles in each sample, which we can then use to get a percent of each ‘thing’ (e.g., benthic foraminifera, fragment, echinoderm piece) in each sample.

Generally, I try to split the sample until about 300 particles remain in one of the cups. This can take anywhere from 3 to 9 splits, depending on how much sediment is in each sample to begin with. Once I have the ~300 particles, I sprinkle them evenly onto a picking tray (a metal tray with a grid on it). I then count the number of each 'thing' on the picking tray. I keep count of each 'thing' using a counter, which makes the process very fast and easy!
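If you like, you can estimate ahead of time how many passes through the splitter a sample will need. Assuming a perfect 50/50 split each pass (real splits are never perfectly even, so this is only a back-of-the-envelope estimate), the calculation looks like this:

```python
import math

def splits_needed(n_particles, target=300):
    """Smallest number of 50/50 splits that leaves at most `target`
    particles in the kept cup. Real splits vary, so treat this as a
    rough estimate rather than a guarantee."""
    if n_particles <= target:
        return 0
    return math.ceil(math.log2(n_particles / target))
```

For example, a sample of ~10,000 particles would need about 6 passes (leaving roughly 156 in the kept cup), while ~2,400 particles needs only 3 – consistent with the 3 to 9 splits I typically see.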

An image of my picking tray with the sample sprinkled on it. Some of the major components, or ‘things’, in the sediment are labeled. Most of them are planktic foraminifera, which can be very small or larger. There are a few benthic foraminifera, several fragments, and only one piece of an echinoderm spine. Generally, planktic foraminifera are most common in these samples.

Once I have this information, I put it into a spreadsheet to plot the data. One thing I haven't mentioned yet is why we gather the biogenic count data in the first place. It's actually very useful! We can use the percentages of each 'thing' in the sediment to calculate the ratio of planktic to benthic foraminifera. This tells us something about dissolution – whether the bottom waters were corrosive and dissolved the fossils – as benthic foraminifera are a bit more resistant to this corrosion than planktic foraminifera. I also calculate the planktic fragmentation index, another ratio that indicates dissolution (the more dissolved a foraminifera is, the easier it is to fragment).
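As a sketch of what those spreadsheet calculations look like – the exact formulations of these indices vary between studies, so take the formulas here as one common choice rather than the definitive method:

```python
def dissolution_proxies(planktic, benthic, fragments):
    """Two dissolution proxies from biogenic counts, as percentages.
    Formulations differ between studies; these are one common choice."""
    # Percent planktic foraminifera of all whole foraminifera:
    # lower values can indicate more dissolution of planktics.
    pct_planktic = 100 * planktic / (planktic + benthic)
    # Fragmentation index: fragments relative to whole planktics
    # plus fragments; higher values suggest more dissolution.
    frag_index = 100 * fragments / (fragments + planktic)
    return pct_planktic, frag_index
```

With a made-up count of 200 planktics, 50 benthics, and 50 fragments, this gives 80% planktic and a fragmentation index of 20%.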

Thus, biogenic count data are a quick but extremely useful way to determine the percent of each 'thing' in a sample, which can be used to infer something about the corrosive nature of bottom waters, which in turn can tell us something about ocean circulation from millions of years ago!

Curating a Personal Fossil Collection

Cam here –

Cretaceous Fossils from Mississippi (Part 1)

Fossil collecting can be a fun and rewarding experience. It helps us get a perspective on how rich and diverse the fossil record is. Some of us make personal collections of the fossils we find. Collections typically start with fossils and other rocks mixed together, with little to no record of where the specimens came from. My way of collecting fossils has changed over the years, from simply piling rocks on my headboard to buying drawers and cabinets to store the specimens, keeping a log book, and including label cards with every specimen in each drawer. There are many different ways to curate your collection. At the end of the day, it is all up to you.

Fossil Collections (Part 3), (Echinodermata, Blastoidea), (Row 2)

When creating a collection or collecting fossils, you want to make sure you know exactly where each fossil came from. The location is probably more valuable than the fossil itself, and you can't always rely on your memory. What I have done is print out labels and write the information down with a black ink pen. There are about 30 labels on each sheet, so I have a good amount. I write additional information on the back, such as the date, coordinates (if available), and more recently the name of the drawer in which that specimen is stored. It is OK not to have information about your specimen; you can always fill the location section with a question mark or "Unavailable". Just make sure you fill out the card to the best of your abilities.  

Filled in label

What you use to store your specimens depends on how delicate and how large they are. Large to small boxes with padding are good things to have; you can find these at hobby shops and arts and crafts stores. Clear jewelry and bead bags are also very useful. With all of these boxes and bags combined, I keep most specimens in cabinets and drawers. I label each drawer by location, age, phylum, or fossil content – it is all up to you. The majority of my drawers are ClearView desk organizer drawers, which you can find in the craft section at Walmart and at craft stores.

Organizing a collection can be fun, but it can also take up space. Make sure you have enough room and don't stack things too high on top of each other – I have had almost half of my collection collapse on me for doing that. Have fun with it!

Labeled ClearView drawers

Data Management

Jen here – 

I started a job as a Research Museum Collection Manager in September, and a large part of it is specimen-based. I handle donations, reconcile loans, look for specimens for researchers, organize the collection, and manage other types of data. Now that my job has become largely remote, I wanted to share some of the things my museum techs and I have been working on to keep our projects moving forward. 

When we think about museums, we immediately think of the beautiful displays of mounted dinosaurs and ancient deep-sea dioramas that transport you through time. However, many research museums are essentially libraries of life (thanks, Adania, for that phrasing). Similar to libraries with books, these institutions hold records of life on Earth, and they are massive. At the University of Michigan Museum of Paleontology we have over 2 million invertebrates, 100 thousand vertebrates, and 50 thousand plants. Each of those specimens is tied to other records and data!

Specimen Database

Digital databases allow for the storage of data related to the specimen including location, time period, taxonomy, rock formation, collectors, and much more! Depending on the type of database the structures are slightly different but the overall goal is the same: create an easy way to explore the specimens, see what is on loan, where they are located in the collection, and if they are on display!

Databases, like regular software, get updates over time. The database I'm working in was started ~10 years ago, and there have been a lot of updates since then, so we are working to upgrade the way the data are organized. For example, there are now fields that didn't exist before, so we are making sure the data are appropriately entered and then fixing those fields. We are also digitizing our card catalog to verify that the specimen data in the database match the physical records. We have three card catalogs: type specimens, alphabetical taxonomic groups, and numerical. I spend time scanning in these cards, and my museum techs help transcribe and verify the data against our other records. 

Example of a card from the University of Michigan Museum of Paleontology invertebrate card catalog. Many are typed index cards with information on the specimen.

I have quite a few donations that have new specimens that need to be put into the database. To do this, I format the dataset and upload it to the database. Seems straightforward but it takes some time and isn’t the most fun task so I have a stockpile of them to get through while I continue my remote work.

Loan Invoices

One of the tasks we had started before the COVID-19 crisis was to digitize our loan documentation. We have documentation for specimens that we loan out to other institutions, for specimens we bring in to study, and for any transfers that may occur. This information had not been digitized, so our first step was to scan the paperwork and transcribe key information, such as: Who were these specimens loaned to? How many specimens were loaned? Were specimen numbers listed? Were these specimens returned? 

We now have a large spreadsheet that allows us to search this information rapidly. For example, when we are working in the collection, we sometimes find specimens with paperwork or that are out of place. Now we can search the number, see if they were on loan, and make sure we close that loan as returned. In some cases we cannot find specimens, so I have to reach out to colleagues at other institutions to see if they have a record that the loan was returned. Then it's up to us to find the specimens in the collection and get them into their proper storage places.
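As a sketch of what that kind of lookup can look like once the loans are digitized – the column names and specimen numbers here are invented for illustration, not our actual database fields:

```python
import csv
import io

# A hypothetical slice of a loan spreadsheet, saved as CSV.
loan_csv = """specimen_number,loaned_to,returned
UMMP-12345,Museum A,yes
UMMP-67890,Museum B,no
"""

def find_loan(specimen_number, csv_text):
    """Return the first loan record matching a specimen number, or None."""
    for row in csv.DictReader(io.StringIO(csv_text)):
        if row["specimen_number"] == specimen_number:
            return row
    return None

# An open loan shows up with returned == "no", so we know to follow up.
record = find_loan("UMMP-67890", loan_csv)
```

The point is just that a consistent spreadsheet turns "dig through filing cabinets" into a one-line search.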

Three-Dimensional Fossils

The last big project we are working on is to get new fossils ready for our online fossil repository: UM Online Repository of Fossils. This involves some on-site work at the collection space and lots of post-processing of the fossils. We use a camera to image a fossil from many angles (photogrammetry) and then stitch the photos together to create a three-dimensional fossil. If you are interested in our protocol and setup, please check out our website by clicking here. Most of this work has been done by me alone, but I am working on ways to incorporate our museum techs into different aspects of the process that can be done at home, such as cleaning the output model and orienting the specimen for final display on the website. Check out our most recent invertebrate addition: Hexagonaria percarinatum.

Example of a species profile on UMORF! Click here to head to the page and explore the viewer.

3D Visualization Undergraduate Internship

Hey everyone! It’s Kailey, an undergraduate student at the sunny University of South Florida.

The image shows a specimen, Gyrodes abyssinus, sitting on a mesh block, with a scan in Geomagic Wrap on the screen in the background.

I wanted to take some time and share with you guys an amazing opportunity I was given earlier this year. As any ambitious college student will tell you, internships are extremely important when it comes to choosing a career path. Not only do they grant students hands-on experience in a particular field, but also general time and knowledge in the workforce. Good internships are hard to come by, which is why I was elated when I got the opportunity to intern at the 3D visualization lab at USF! 

And yes, the lab is as cool as it sounds.

For a place where complex research happens daily, the mission of the lab is rather simple: to harness 3D scanning equipment and data processing software. These technological tools have been a wonderful addition to the arts, the humanities, and STEM everywhere, as they have not only supported, but completely transformed, research in these fields. This dynamic lab embodies the philosophy of open-access research and data sharing, meaning that scientists and researchers from all over the world are able to use its different collections and visit historical sites from the comfort of their homes and offices.

This image shows the FaroArm scanner extended.

My job at the lab was to scan and process some specimens from the Department of Geosciences' paleontological collection. The first step in this process is to scan my object in various positions (figure 1) using the FaroArm laser scanner (figure 2). This bad boy has three different joints, allowing the scanner to move around any object seamlessly. The FaroArm also has a probe with a laser, which essentially takes a bunch of pictures of the object and overlays them. An important note is that these "various positions" need to be manually connected in a software package called Geomagic Wrap; therefore, every scan must match up seamlessly like a puzzle! This was probably the most difficult thing to learn, as you not only must think more spatially, but also pay close attention to the small, yet distinguishable, details, like contour lines and topography (figure 3). In some cases, these small details mean the most to research scientists by showing things like predation scarring and growth lines.

This image shows a close-up shot of the contour lines and topography on the 3D model.

Once the scans are connected and we have a 3D model, the file is moved into a different software package called ZBrush. This is where the fun, creative aspects come in! ZBrush allows users to fill in any holes that appear in the scan and clean up any overlapping scan data (which happens when the scans aren't matched up properly in Geomagic). Next, we paint texture onto the model using different pictures of the fossil. Then, voila, you have a bona fide 3D model (figure 4). The model shown in figure 4 is of Gyrodes abyssinus Morton, a mollusc from the Late Cretaceous. 

I completed a total of three scans and processed models, but my internship was cut short by the coronavirus pandemic. While my time at the lab was brief, I learned so much in terms of technical skills and problem solving. However, the most notable thing I learned was just how interdisciplinary science and research are at the university level. Networking with archeologists, geologists, anthropologists, and so many more opened my eyes to the different fields contributing to the research world. The experiences I gained at the 3D visualization lab will follow me through my entire academic career.

This is an image of the final 3D model of Gyrodes abyssinus with coloration and texture.

You can visit https://www.usf.edu/arts-sciences/labs/access3d/ for information on the 3D lab and visit https://sketchfab.com/access3d/collections/kailey-mccain-collection to view the rest of my collection.

International Ocean Discovery Program Early Career Workshop

Adriane here-

Earlier this year before the world went into lock down, I had the opportunity to participate in an early career researcher (ECR) workshop through the International Ocean Discovery Program (IODP). The workshop was focused on how to write a scientific drilling proposal with colleagues and friends.

The workshop was held at Lamont-Doherty Earth Observatory in Palisades, New York, just north of New York City. At Lamont, scientists and staff manage U.S. scientific support services for IODP, the major collaborative program that, among several other things, allows scientists to live and work at sea for two months drilling and studying sediment cores. The workshop was specifically for early career researchers, loosely defined as researchers who have earned their Ph.D. but have not achieved tenure (that critical phase in a professor's career when they receive a permanent position at their college or university).

The Gary C. Comer building on Lamont’s campus, where the IODP ECR workshop was held.

This workshop, which first ran a few years back, was conceived by Time Scavengers' own Dr. Andrew Fraass and his close colleague, Dr. Chris Lowery. They, along with their colleagues, built the workshop, and it has run every 2-3 years since its conception. What is so neat about the workshop is that it is also run and organized by other ECRs, with the help of more senior scientists.

The first day of the workshop focused on introducing the attendees to aspects of IODP, including presentations on the past and future of scientific ocean drilling and the IODP proposal writing process. We also did participant introductions, where we stood up and had 1 minute to talk about ourselves, our research, etc., using only images on one slide. We, the participants, were also broken out into groups later in the day by themes we identified with (for example, I put myself in the Biosphere group because I work with fossils and am interested in evolutionary questions). From these breakout groups, we then identified 5 places in the Pacific Ocean we would like to target for drilling. Later that night, the workshop organizers held a networking reception for us at a nearby building on campus. The networking event was incredibly cool (they fed us dinner, and it was really great food) and useful (I had the opportunity to meet and speak with other ECRs who have similar interests to mine).

My introductory slide. The upper left box contained our image, name, and association; the upper right box contained a research image (I cheated and included two) and our research interests in three words or less, the bottom left box contained our research expertise and any contact information, the bottom right box contained a mediocre skill we have (again I cheated and used this to plug this website).

The second day of the workshop, we arrived and discussed how to obtain data for a drilling proposal. To give some insight into what goes into a drilling proposal: it is a 15+ page document in which scientists write out their hypotheses, where they want to drill on the seafloor, preliminary data that support the hypotheses outlined, and what we call site survey data. Site surveys are when scientists take smaller ships out with an apparatus pulled behind the ship. These apparatuses use sonar to map the features of the seafloor, but also the properties of the sediment below it. The changing densities of the different sediments appear as 'reflectors', allowing an MRI-like preliminary investigation of the sediments the scientists want to drill into. An entire presentation was dedicated to obtaining older site survey data. We also heard presentations about the different drill ships and drilling platforms implemented by IODP. The second part of the day was again spent working in groups. This time, however, we split ourselves into different groups depending on what area of the Pacific Ocean we were interested in working on. I put myself with the group interested in drilling the southeast Pacific, off the southern coast of New Zealand. Here, we came up with hypotheses for our proposals and began writing them down.

Example of a seismic image from a seismic site survey. The very strong, prominent lines in here are called ‘reflectors’. This image shows the location of a proposed drill location, named SATL-56A. From this seismic image, we can interpret that the top layers of ocean sediments are very flat. The seafloor, which is recognized based on its more ‘spotty’ appearance and lack of horizontal lines, is very prominent here (the top of which is indicated by the green reflector line). These images are essential to include in a drilling proposal so everyone has an idea about what might be expected when drilling.

The third and fourth days of the workshop included limited presentations, with more time dedicated to letting the groups work on their proposals. One of the main outcomes of the workshop is to have participants walk away with an idea of how to write a drilling proposal, but also to have the basic groundwork in place for a proposal with a group of people who share similar interests. So ample time was given for the participants to refine their hypotheses, find some preliminary data about their drilling locations from online databases, and build a presentation to present to the entire workshop. On the afternoon of the fourth day, the teams presented their ideas to everyone, including more senior scientists who have submitted drilling proposals in the past and have worked on panels to evaluate others’ drilling proposals.

All in all, this was a great workshop that really allowed folks to learn more about the IODP program, where and how to find important resources, and how to begin writing these major drilling proposals. These events are particularly important for scientists from marginalized backgrounds and first-generation scientists. For me (a first-generation scientist), making connections with others is sometimes very difficult, as I have terrible imposter syndrome (when you feel like you don't belong in a community and that you will be found out as an imposter) and am hyper-aware that I was raised quite differently than most of my peers. Being in such a setting with other scientists, forced to work together, is terrifying but also good, because I had the opportunity to talk to and work with people I would not normally work with. For example, I had wonderful discussions with microbiologists and with professors whose work focuses more on tectonics – two research areas I had hardly interacted with previously.

Interning at a Paleontology Lab

Haley here –

A Busycon coactatum, or Turnip Whelk, specimen.

I recently started interning with Dr. Sarah Sheffield (one of Time Scavengers' collaborators and a USF professor) at the University of South Florida (USF)! As a high school senior, this has been an extremely influential experience for me. With Dr. Sheffield, I have been learning how to catalog fossils, and I have been slowly (but surely!) entering the USF collections into an online database (MyFossil). Along with learning how to photograph and catalog fossils, I have been able to learn about graduate and undergraduate research, sit in on a college-level course, learn about fossil identification and photography (1), and meet some amazing people.

I was able to have this internship experience through a class called Executive Internship that is offered to seniors at the high school I attend. This class spends the first nine weeks teaching career skills like writing a resume and cover letter, interview etiquette, and effective communication. Throughout the first nine weeks, we are encouraged to research various career paths that interest us and speak with people working in those careers. By the end of the nine weeks, we are expected to have secured an internship. The school partners with various businesses (such as the Florida Aquarium) in order to ensure that students have options. Some students use these connections, while others choose to intern with businesses they have been to before or reach out to family and friends for suggestions. Others – like myself – email everyone they can think of to see what they would think about having a high school intern. I was fortunate enough that one of the people I reached out to suggested that I speak with Dr. Sheffield, and I was even more fortunate that Dr. Sheffield had a project that she wanted to start and was willing to let me help. After everyone has secured an internship and we have completed the first nine weeks of class, all of the interns are given permission to sign out of school instead of attending the Executive Internship class. In exchange for essentially leaving a class before school ends, all of the interns are required to record an average of five hours per week at our internships and complete a weekly log. Other than the hour requirement and the log, internship schedules and tasks vary for each student.

I cannot say when I first became interested in paleontology and geology, but this internship experience has only helped my interest grow. When I was trying to explain why I wanted to study these fields, my mom helpfully explained that I had "always been a rock girl", which sums it up pretty well. On family trips to North Carolina when I was in elementary school, I became fascinated by the variety of gems and minerals you could find. As I took more science classes, I learned about crystal structures and how various formations occur. In a public speaking class, I was able to pick any topic I wanted and ended up falling into a rabbit hole on the history of paleo-illustration. I think part of what draws me to these fields is how seamlessly they integrate with so many others. From chemistry and biology to history and art, there are so many aspects of paleontology and geology that combine with other fields. In any case, there is always something new to learn and something to dig deeper into that can reveal so much. This is only highlighted by my experience sitting in on Dr. Sheffield's class. The class addresses the evolution of life on Earth, but also reaches into questions of what we truly define as Homo sapiens, the history of paleontology in the United States, biomechanics (how organisms move), and much more.

Bryozoan encrusting on a Busycon carica (Knobbed Whelk) specimen.

When I started my internship, I was unsure what to expect. I am a high schooler, and I was going to be working with a college professor to begin a new project. I was excited to learn, but I can say that I definitely did not expect to come home and tell my parents that I wished there were more Anadara (2) specimens because they were the most fun to photograph. I learned the conventional lighting angles to use for fossil photography, how to measure various shells, and the information needed to catalog fossils. Properly labeled fossils soon became a valued commodity after some specimens arrived with labels as sparse as "bivalve", "east coast", or "recent". To catalog the specimens, I have been using the MyFossil database. It is an extraordinary website that allows museums and researchers to share their specimens so that they are available worldwide. It is amazing to know that I can catalog a specimen and see it appear online next to a trilobite specimen from China and a shark tooth from California. MyFossil has an important feature that allows specimens with detailed information (classification, dimensions, geochronology, and locality) to be marked research grade. This allows MyFossil to function both as a free online museum and as a valuable tool for researchers.

Learning how to catalog fossils entails learning about fossils simply through exposure. I have learned about the variety of features of shells and how they function for each species. In order to revise some entries to make them research grade, I have used a website called Macrostrat. By looking up geochronology based on lithostratigraphy or formation, I have begun to recognize the common rock units of various sites in Florida. I have also been able to learn more about fossil features by asking how to denote various characteristics, like borings. A notable snail specimen had a bryozoan encrusting it (3). By cataloging fossils and asking questions as I go, I have learned much more than I expected to learn from the cataloging labels.

This internship has been a great learning experience. I was admittedly unsure of what I wanted to do in college, other than the fact that I wanted to do something with geology and fossils. Interning has allowed me to learn, discuss projects with others, and see the sheer variety of research within the USF School of Geosciences. This, paired with everyone’s enthusiasm for their research, has helped me see the kind of environment that I want to be a part of. It has been an opportunity that has allowed me to gain a better understanding of the college experience, and it has allowed me to have hands-on research experience in the field that I love. I look forward to expanding what I have learned even more through the rest of my internship!