
Go Beyond Traditional Microscopy: How To Use Label-Free Techniques for Live Cell Imaging

Live cell imaging is a major area of growth in the microscopy world as researchers search for techniques to elucidate further mysteries of the human body. Traditional microscopy systems often fail to automatically identify individual cells due to lack of contrast, and while labelled techniques can produce high-contrast images, they perturb the cells and can be phototoxic. This can limit the types of cells that can be used and the duration over which they can be imaged before measurement-induced changes in cell behaviour emerge.

Given these challenges, the search for a panacea can be futile: an ever-increasing number of technologies refine previous versions, each with a valid use and a question it answers well.

The process of defining the desired outcome of an experiment can be clouded by the latest and greatest shiny new toy that is a ‘must have’ for any self-respecting imaging facility. Is it 2D or 3D? Can we do spheroids? What about angiogenesis, Matrigel, bespoke slides, polylysine, fibronectin, or even hydrogel matrices? Do I need super-res, and what about a new confocal? Do we all still fluoresce – and how many photons?

Clearly, it’s a minefield of instrumentation – but what about applications of your science?

We’re in a position to discuss some interesting solutions to application needs.

Let’s put the quest to dive deep into cellular activities to one side and consider how cells interact with each other in a natural state. Have you ever thought about how cells react to fluorescent probes? Do the probes alter how the cells interact with each other? What about intense light? High light intensity does not bode well for the humble cell – are you perturbing the very cells you are trying to observe?
Population studies are normally the realm of cytometry, not the microscopist’s bread and butter.
Perhaps a system that allows both population and individual analysis would be of value, helping you understand kinetic data about your cells – speed, displacement, Euclidean distance, meandering index and directionality – associated with your treatments. Do you wish to track them? Understand their lineage? You could then re-culture the cells without any reservation, or take them for testing elsewhere.

Phasefocus in Sheffield, UK, has developed the Livecyte, which employs a different imaging modality called ptychography. This method suits live cell imaging: its 650 nm, <1 mW laser delivers incredibly low light toxicity. The instrument creates diffraction patterns and stacks them to run through the ptychographic algorithm, producing images that emulate the resolution of fluorescently tagged cells and ultimately allow excellent segmentation and tracking. Phasefocus has also thought of the needs of scientists grappling with all this data: while the system is open to external analysis programs such as ImageJ, CellProfiler and MATLAB, the Livecyte has some impressive dashboards built into the software for popular applications such as mitotic time, wound healing, angiogenesis, proliferation and morphology.
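The kinetic metrics mentioned above have simple definitions. As a rough sketch (illustrative code only, not Livecyte’s implementation), mean speed, net displacement and meandering index can be computed from a tracked centroid path like this:

```python
import math

def track_metrics(points, dt):
    """Summarise one cell track from a list of (x, y) centroids.

    points: centroid positions per frame (e.g. in microns),
    dt: time between frames (e.g. in minutes).
    """
    # Total path length: sum of frame-to-frame step sizes
    steps = [math.dist(points[i], points[i + 1]) for i in range(len(points) - 1)]
    path_length = sum(steps)
    # Euclidean (net) displacement: straight line from start to end
    displacement = math.dist(points[0], points[-1])
    duration = dt * (len(points) - 1)
    return {
        "mean_speed": path_length / duration,
        "displacement": displacement,
        # Meandering index: net displacement / total path (1.0 = straight line)
        "meandering_index": displacement / path_length,
    }

# A short, made-up track with a 5-minute frame interval
track = [(0, 0), (3, 4), (6, 8), (6, 12)]
m = track_metrics(track, dt=5)
print(m)
```

A meandering index near 1 indicates directed migration, while values near 0 indicate a cell wandering around its starting point.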

Ok, shall we dive into these cells now? Say you have viewed the subtle interactions in your cells and identified some interesting behaviours on the Livecyte, and now you would like to put these cells through a confocal microscope for depth selectivity, optical sectioning or 3D imaging. Damn – it is tied up for the next month! Maybe we should consider a benchtop confocal, and let’s make it simple and laser-free. This is interesting if you are averse to long-term costs of ownership – have you ever added up what you spend on confocal microscope maintenance? Aurox in Oxford, UK, has added a new system to its already successful ‘bolt-on’ Clarity range of confocal microscopes with the soon-to-launch Unity. This super-small benchtop system has the hallmarks of a winning combination: ease of use (yes, finally a confocal that doesn’t require a flight licence to operate) and powerful imagery. The spinning disk method Aurox has patented is elegant and fundamentally simple. Normally the pinhole arrangement of a spinning disk allows around 1% of the light to be utilised, hence the need for high-power lasers; Aurox has developed a novel optical arrangement that uses over 50% of its CoolLED light source. Images are produced at a very rapid rate – say 150 images in 11 seconds – without the photobleaching evident in high-intensity laser systems, and with no need for laser safety restrictions. It is often difficult to obtain both high-resolution and widefield images, let alone an ultra-large field of view – yet full overview images of the sample and mount can be captured with the Aurox’s sCMOS detector.

In keeping with the need to make it easy, the instrument is operated from a large-format tablet over a secure wireless network – no more messy cables. The software, Visionary, is a truly simple, browser-based interface that enables z-stack, time-lapse and multichannel imaging with OME-TIFF output.

As instrument manufacturers increase the production of technology, generally they create efficiencies in manufacturing and the economies of scale kick in. As a result, one usually experiences a price drop in subsequent iterations of the model, and additional R&D costs are absorbed by higher sales volumes. This might be a common occurrence, but is not always the case…

I have noted a rise in the cost of a relatively simple technology that employs basic fluorescence as its method of choice to perform live cell imaging inside an incubator. I doubt this rise is a result of additional functionality due to component pricing – not all systems need to be action-packed with every capability to suit a broad range of users. People have also communicated to me their abhorrence that such an expensive system be locked away in an individual researcher’s incubator. Clearly, there are always exceptions to the rule – and at times the application may warrant it. I think we have found a solid alternative at a fraction of the price, with an impressive imaging system to boot.

Logos Biosystems in Seoul, South Korea, has developed the Celena-S and Celena-X – an on-the-bench high content imaging system. It is a nice little system: fully automated, multi-channel, with multi-well and multi-position analysis – all on your bench, with a stage-top incubator! The tech is more than sufficient for its intended applications, and the images are exceptional, with lightning-fast laser autofocus.
What I like most about the Celena-X is the speed and quality of the images, particularly at this price. I guess the proof is in the pudding with this one – I suggest a demo.

Looking for the perfect analytics instrument for YOUR next big discovery?

Speak with the ATA Scientific team today to get expert advice on the right instruments for your research

Request free consultation

How Live Cell Research is Changing the Face of Biological Research

If you want to see in real time what’s going on inside living cells, then you should be performing live cell imaging. Live cell imaging techniques allow real time examination of almost every aspect of cellular function under a variety of normal and experimental conditions, including:

Stem cells

Stem cells are so sensitive that conventional fluorescence microscopy – imaging live cells at high resolution over long periods – can be lethal to them. That’s why researchers often end up imaging dead or dying cells and missing dynamic processes that are essential to our understanding of cell biology.

With the live cell imaging techniques of today, cell biology is in the midst of a microscopy boom. Stem cell researchers are coming up with inventive techniques for capturing live images of cells in their native habitats, and researchers can catch glimpses of stem cell activity by focusing their scopes on cells cultured in dishes or in tissue extracted from animals. Observing living stem cells under such conditions is driving medical advances for cancer, heart disease, and degenerative diseases such as Alzheimer’s.

Mitosis

Perhaps the most amazing thing about mitosis is its precision, a feature that has really come to light with advances in light microscopy. Researchers now know that mitosis is a highly regulated process involving hundreds of different cellular proteins and the dynamic nature of mitosis is best appreciated when this process is viewed in living cells. Advances in fluorescence live cell imaging have allowed scientists to study this process in great detail, providing important insights into the biological control of this process and how it might go wrong in diseases such as cancer.

Why live cell research?

The study of cells is the study of basic biology, and living cells offer one of the most accessible models of biological processes scientists have. Continual advances in imaging techniques and design of fluorescent probes improve the power of this approach, ensuring that live cell imaging is an important tool in biology.

Live cell imaging is the study of living cells using time-lapse microscopy. It was pioneered in the first decade of the 20th century – one of the first time-lapse microcinematographic films of cells ever made was by Julius Reis, showing the fertilisation and development of the sea urchin egg. Since then, several microscopy methods have been developed that allow researchers to study living cells in greater detail and with less effort.

Live cell research techniques

The growing number of live cell research techniques means you can obtain greater amounts of information without stressing out your cells (or yourself). The three most common techniques are:

Widefield Fluorescence Microscopy

The most basic technique for live cell imaging, widefield fluorescence microscopy yields valuable results if you are imaging adherent cells, large regions of interest (such as organelles) or very thin tissue sections (less than 5 micrometres). A CCD camera is used to capture images, and the epi-fluorescence illumination source can be a mercury lamp, xenon lamp, or LEDs. Each light source requires carefully matched interference filters for the specific excitation and emission wavelengths of your fluorophore of interest.

Widefield and Contrast

Widefield fluorescence microscopy can be used in combination with other common contrast techniques such as phase contrast and differential interference contrast (DIC) microscopy. This combination is useful when performing live-cell imaging to examine general cell morphology or viability while also imaging regions of interest within cells. The combination of contrast and fluorescence microscopy is usually carried out in two separate image captures, using the transmitted light for contrast followed by epi-fluorescence imaging for detection of fluorophores. The two images are then combined in post-image-analysis.

Optical Sectioning

Optical sectioning can alleviate blurring, since only information from the region that corresponds to the objective depth of field is extracted. This technique also enables volume rendering of stacks to generate three-dimensional images. Aside from using confocal techniques, optical sections can also be obtained in widefield fluorescence microscopy using structured illumination.
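As a simple illustration of what an optical-section stack enables, a maximum-intensity projection collapses the sections into a single 2D view. This sketch uses random NumPy data in place of real image sections:

```python
import numpy as np

# Toy z-stack: 5 optical sections of a 4 x 4 field, indexed (z, y, x).
# Random data stands in for real optical sections here.
rng = np.random.default_rng(0)
stack = rng.random((5, 4, 4))

# Maximum-intensity projection: collapse the stack along z, keeping
# the brightest voxel at each (y, x) position.
mip = stack.max(axis=0)

# Every projected pixel is at least as bright as in any single section
assert (mip >= stack[2]).all()
print(mip.shape)  # (4, 4)
```

The same stack can also be handed to a volume renderer to build a full three-dimensional view, as described above.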

With all live-cell imaging experiments, the main challenges are to keep your cells alive and healthy over a period of time while they are on the stage of the microscope. Your cells must be kept in a temperature- and pH-stable environment, which is usually achieved by using a chamber where the cells are placed, or larger environmental chambers around the microscope itself.

Benefits of live cell research

With live cell imaging, kinetic processes such as enzyme activity, signal transduction, protein and receptor trafficking, and membrane recycling (endocytosis and exocytosis) can all be interrogated and:

  • Cellular enzymes and other cytosolic molecules remain in the cell
  • Scientists can observe dynamic cellular processes as they happen
  • Cellular structures can be studied in their native environment, meaning less experimental artifact
  • Cellular biomolecules and structures can be tracked over time
  • Interactions between cells can be observed

Factors to consider, however, include:

  • Cells must be kept in their natural physiological ranges for pH, temperature, and osmolarity
  • You must have a specific way to label your target, whether it is a molecule, a cellular function, or a cellular state (and illuminate it with minimum toxicity)
  • Living cells are not generally permeable to large molecules (i.e., antibodies)
  • Moving objects can be more difficult to keep in focus
  • Interrogation techniques can be harmful to living cells

To best manage this, you need to make a plan.

Making a plan

When considering a live cell imaging experiment, it is critical to devise an experimental plan. Successful live cell imaging experiments can be a major technical challenge and the conditions under which cells are maintained on the microscope stage, although widely variable, often dictate the success or failure of an experiment.

An important caution is to ensure that cells are in good condition and function normally while on the microscope stage with illumination in the presence of synthetic fluorophores or fluorescent proteins. The goal is to design your experiment to be as non-invasive as possible, since fluorescent imaging can have unwanted side effects due to illumination, which isn’t something your cells are exposed to in the incubator.

The good news is that normal atmospheric oxygen tension levels are suitable for most cultures. With regard to osmolarity, most cell lines have a large tolerance for osmotic pressure, with good growth at osmolarities between 260 and 320 milliosmolar. When cells are grown in open plate cultures or Petri dishes, slightly hypotonic medium can be used to compensate for evaporation.

Choosing Livecyte from Phasefocus

Live-cell imaging requires you to keep the cells functioning during the experiment, while being able to assess whether the experimental method is causing problems that will complicate the interpretation of your results. With Livecyte from Phasefocus (a unique system for live cell analysis that enables the study of phenotypic and kinetic behaviour of individual cells and cell populations over hours or days), this process is made simple.

Livecyte uses an optimised version of Quantitative Phase Imaging (QPI) called ptychography to generate quantitative data without the need for cell labelling. Livecyte exploits the inherent contrast mechanisms that cells possess – refractive index and thickness variations – to produce high-contrast, high-fidelity images without halos or speckling. Cells can be observed with minimal perturbation, which is especially useful for primary cells and stem cells.

With Livecyte, you can:

  • Directly measure cell motility and separate cell motility from cell proliferation
  • Characterise morphological and behavioural cell phenotypes during wound healing
  • Perform non-invasive time-lapse imaging to quantify cell death without labels, dyes or phototoxic damage
  • Automatically identify cells undergoing mitosis and extract the mitotic index
  • Calculate cell death dose response curves without the use of fluorescent labels
  • Analyse and extract parameters to identify heterogeneity within mixed cancer cell populations
  • Enable investigation of combination therapies on primary cancer cultures
  • Understand the underlying mechanisms of angiogenesis.

Livecyte comes complete with automated cell tracking software (patent pending), which can follow all cells for a complete time-course, even if those cells pass over each other. Cells are always in focus irrespective of focal drift or uneven sample holders.

To learn more about Livecyte from Phasefocus download the product brochure or contact ATA Scientific to make a product enquiry. We’re the specialist analytical instrument provider for Australia and New Zealand, and we offer expert support with every decision you make.

Contact us to discuss your requirements.


Recording Available From mRNA Vaccine Therapy Workshop

Masterclass: Nanoparticle formulation

Recordings from the recent Advances in mRNA Vaccine Therapies in Cancer Immunotherapy and Infectious Diseases Livestream

Nanotechnologies present new opportunities for advancing medical science and disease treatment in human health care.
At Precision NanoSystems, we strive to create and drive knowledge into the community through the Nanomedicine Innovation Network (NIN). On April 24, we held a livestream event on mRNA vaccine therapies at the University of Strathclyde with two industry-leading experts. Below are the recordings from their talks.

Dr. Justin Richner earned his doctoral degree from the University of California at Berkeley. He performed his post-doctoral studies at Washington University in St. Louis studying viral immunology and vaccine development. In this setting he developed, in collaboration with Moderna, a Zika virus vaccine with the novel mRNA-lipid nanoparticle gene therapy platform. Recently, he moved to the Department of Microbiology & Immunology at University of Illinois Chicago.
Click below to view his presentation on his Zika virus vaccine research and how today he is further refining the mRNA-LNP vaccines to overcome immune-suppression and combat infectious diseases which have proven a significant challenge to traditional vaccine development.

 Click here to watch video
Dr. Yvonne Perrie is an internationally recognised researcher with a strong track record of high impact publications, with approximately 100 peer reviewed manuscripts, 5 textbooks and 5 patents. Dr. Perrie has effectively delivered in various academic and motivational leadership roles including Associate Dean for Learning and Teaching, Director of the Medicines Research Unit, and Head of Pharmacy. In 2016, she moved to the University of Strathclyde as part of their Global Talent Appointment Programme. She gained her PhD from the University of London.

Click below to watch Dr. Perrie present on her multi-disciplinary research focused on the development of drug carrier systems to facilitate the delivery of drugs and vaccines, providing practical solutions for current healthcare problems.

Click here to watch video

 Stay tuned!

Coming in late May through June we will have a GMP 4 part series with the industry’s top leaders in clinical development.

Want to find out how the NanoAssemblr Benchtop® helped in advancing their research, and whether it would be a good solution for advancing yours?
Let’s Talk 

How Nanoparticle Tracking Analysis Compares to Dynamic Light Scattering for Measuring Particle Size

Nanoparticle Tracking Analysis (NTA) and Dynamic Light Scattering (DLS) are complementary techniques that offer different insights into your samples. DLS will generally measure a wider size range than NTA, but NTA offers greater resolution than DLS (even with Multi-angle Dynamic Light Scattering).

In an ideal scenario, you may want to consider a combination of both systems to take advantage of the complementary information the two techniques can provide.

What is Nanoparticle Tracking Analysis?

NTA provides real-time monitoring of subtle changes in the characteristics of particle populations, with all analyses confirmed by visual validation. Measurements take just minutes, allowing time-based changes and aggregation kinetics to be quantified.

NTA makes practical and effective use of the properties of both light scattering and Brownian motion to gather the nanoparticle size distribution of samples in liquid suspension. This is particularly important in making real time measurements such as in the study of protein aggregation, viral vaccines and exosomes/microvesicles.

NTA visualises, measures and characterises virtually all nanoparticles (10–2000 nanometres). A monochromatic light source (laser beam) is passed through the sample chamber and illuminates the particles in suspension in such a manner that they can be easily magnified. Enhanced by a near-perfect black background, particles appear individually as point-scatterers moving under Brownian motion. Polydisperse and multimodal systems are instantly recognisable and quantifiable, and in fluorescence mode, suitably labelled particles can be discriminated from the non-labelled background.

A video camera captures the particles moving under Brownian motion, and the software uses the Stokes-Einstein equation to calculate each particle’s hydrodynamic diameter. The ‘movie’ is recorded at a rate of 30 frames per second, and results may be exported to a spreadsheet format.
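The Stokes-Einstein step can be sketched in a few lines. This is illustrative code under stated assumptions (2D tracking, water at 25 °C), not the NanoSight software itself:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(msd_per_frame, dt, temperature=298.15, viscosity=0.89e-3):
    """Stokes-Einstein estimate of diameter from 2D tracked motion.

    msd_per_frame: mean-squared displacement per frame (m^2),
    dt: frame interval (s); defaults assume water at 25 degrees C.
    """
    # For 2D tracking, <r^2> = 4 * D * dt
    diffusion = msd_per_frame / (4 * dt)
    # Stokes-Einstein: d = k_B * T / (3 * pi * eta * D)
    return K_B * temperature / (3 * math.pi * viscosity * diffusion)

# A ~100 nm particle in water at 25 C diffuses at D ~ 4.9e-12 m^2/s,
# giving an MSD per frame of ~6.5e-13 m^2 at 30 frames per second.
d = hydrodynamic_diameter(msd_per_frame=6.53e-13, dt=1 / 30)
print(f"{d * 1e9:.0f} nm")  # 100 nm
```

Because each particle is sized from its own trajectory, the result is a genuinely number-weighted distribution.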

NTA is a three to five step measurement process. Loading the sample into the cell and getting results can take as little as two to three minutes, and you can run batches of samples under the same conditions and directly compare results. Samples are prepared in an appropriate liquid (typically water-based) at a concentration of 10⁷–10⁹ particles/ml and placed in the sample chamber, which has a volume of 0.3 ml.

What is Dynamic Light Scattering?

DLS is the most common method of measurement for particle and molecular size analysis in the nanometer range. Typical applications are emulsions, micelles, polymers, proteins, nanoparticles or colloids.

Based on the Brownian motion of dispersed particles, DLS can be used to determine the size of small particles in suspension or polymers in solution.

The basic principle of DLS is simple: the sample is illuminated by a laser beam and the fluctuations of the scattered light are detected at a known scattering angle by a fast photon detector.

The diffusion of particles moving under Brownian motion is converted to size and a size distribution using the Stokes-Einstein relationship. Non-Invasive Back Scatter (NIBS) technology is incorporated to give the highest sensitivity together with the widest size and concentration range. Measurement of size as a function of concentration enables the calculation of kD, the DLS interaction parameter. The Microrheology option uses DLS measurements of tracer particles to probe the structure of dilute polymer and protein solutions.
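The conversion from intensity fluctuations to size rests on the link between the correlation decay rate, the scattering vector and the diffusion coefficient. A hedged sketch for a monodisperse sphere, assuming a 633 nm laser and the 173° backscatter geometry typical of NIBS instruments:

```python
import math

def dls_decay_rate(diameter, wavelength=633e-9, angle_deg=173,
                   n=1.33, temperature=298.15, viscosity=0.89e-3):
    """Expected field-correlation decay rate for a monodisperse sphere.

    Assumed geometry: 633 nm laser, 173-degree backscatter (NIBS-style),
    water at 25 degrees C. Returns Gamma in 1/s, where
    g1(tau) = exp(-Gamma * tau).
    """
    k_b = 1.380649e-23  # Boltzmann constant, J/K
    # Scattering vector magnitude for the given angle and wavelength
    q = (4 * math.pi * n / wavelength) * math.sin(math.radians(angle_deg) / 2)
    # Stokes-Einstein diffusion coefficient for this diameter
    diffusion = k_b * temperature / (3 * math.pi * viscosity * diameter)
    return q * q * diffusion

# Smaller particles diffuse faster, so their correlation decays faster
assert dls_decay_rate(50e-9) > dls_decay_rate(200e-9)
```

In practice the instrument works in the opposite direction: it fits the measured decay rate and inverts this relationship to report a size.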

How do the two compare for measuring particle size?

The technique of Nanoparticle Tracking Analysis utilises the trajectories of individual scattering objects observed under a microscope and their displacement related to each object’s size. Dynamic Light Scattering on the other hand, utilises a technique where the intensity fluctuations in the scattered light are analysed and related to the diffusion of the scattering objects.

In a study published in Pharmaceutical Research, NTA was shown to accurately analyse the size distribution of monodisperse and polydisperse samples. The presence of small amounts of large (1,000 nm) particles generally does not compromise the accuracy of NTA measurements, and a broad range of population ratios can easily be detected and accurately sized. NTA proved to be suitable to characterise drug delivery nanoparticles and protein aggregates, complementing DLS.

DLS measures changes in scattering intensity from a bulk sample, whereas NTA measures observed particle diffusion directly, particle by particle. This gives the NTA method a series of key advantages. NTA can provide a wealth of additional information beyond particle size and:

  • can detect samples 10–1000 times more dilute than DLS
  • requires no information about collection angle, wavelength or solvent refractive index
  • can give the percentage of aggregated particles by number directly
  • provides direct measurement without modelling or assumptions
  • can offer higher resolution of peaks in polydisperse distributions
  • can quickly and effectively provide the number-weighted size (e.g. if you need to show the smallest nanorods)
  • allows you to selectively look at only a fluorescently tagged part of the distribution.
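The difference between number weighting and intensity weighting is easy to demonstrate. In the Rayleigh regime, scattered intensity scales roughly as the sixth power of diameter, so an intensity-weighted (DLS-like) average is dominated by a handful of large particles, while a number-weighted (NTA-like) average is not. An illustrative calculation with made-up data:

```python
# A made-up mixture: many 50 nm particles plus a few 300 nm aggregates
sizes = [50] * 990 + [300] * 10  # diameters in nm, counted by number

# Number-weighted mean: what a particle-by-particle technique reports
number_mean = sum(sizes) / len(sizes)

# In the Rayleigh regime, scattered intensity scales roughly as d^6,
# so an intensity-weighted (DLS-like) mean is dominated by the aggregates
weights = [d ** 6 for d in sizes]
intensity_mean = sum(d * w for d, w in zip(sizes, weights)) / sum(weights)

print(number_mean)     # 52.5  -> the 1% of aggregates barely move it
print(intensity_mean)  # ~299  -> almost entirely set by the aggregates
```

The same sample thus yields a ‘mean size’ of about 52 nm or about 299 nm depending on the weighting, which is why the two techniques are best treated as complementary.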

Why it’s important to detect size and count of nanoparticles

It’s essential that every effort is made to classify nanoparticles and assess their safety, ensuring they’re fit for purpose. Knowing the size distribution of individual nanoparticles offers a competitive advantage in many industries and fields of research.

NTA instruments offer the best opportunity for gathering accurate data. For example, the Malvern NanoSight NS300 can automatically track a range of different-sized particles simultaneously.

Malvern NanoSight NS300 provides high resolution size distribution, concentration, protein aggregation and viscosity measurements for individual nanoparticles while a fluorescence mode allows differentiation of fluorescing particles. Events such as aggregation and dissolution can be monitored as they occur. The software is specifically designed to meet the needs of both new and experienced users and Standard Operating Procedures (SOPs) can be easily set up. For a more in-depth analysis of results, NS300 can offer customised reporting and full access to raw data.

Want to learn more about Nanoparticle Tracking Analysis and the Malvern NanoSight NS300? Download the product brochure today or contact us here.


How to Describe a Particle Using a Single Number — Understanding Equivalent Sphere Theory

Why is laser diffraction one of the most used particle size analysis techniques?

Laser diffraction is one of the most popular methods for particle size analysis due to its dynamic nature and range. Laser diffraction can be used to determine a variety of particle sizes in a range of substances such as liquid suspensions, dry materials, and aerosols.

However, it is important to note that different particle size analysis technologies can quite often produce different results for the same sample. The logical reason for this is that each particle analysis measurement technique measures a different part or aspect of the same material.

For this reason, all particle size analysis results must be considered as the best indications possible — rather than definitive and exact measurements.

What does it mean to describe a particle?

A single number cannot accurately describe the size of a three-dimensional shape. While this is true in many situations, it’s particularly relevant when attempting to describe complex shapes, such as a grain of sand or a particle within a can of paint.

Although it may be hard for many to understand the need for such minute measurements, many industries today rely on the ability to use a particle size analyser to measure the size of incredibly fine particles. Quality Assurance Managers within specific industries and organisations may need to know whether the average size of particles has increased or decreased since the last production run.

For all ground and milled materials (such as coffee, powders and minerals), it is the particle size produced that typically determines the performance of the product; if not optimised, it can reduce manufacturing efficiency and increase overall costs. Particle size can affect a wide range of material properties, including reaction and dissolution rates, how easily ingredients flow and mix, and compressibility and abrasivity. To simplify the measurement process, it is often convenient to define particle size using the concept of equivalent spheres.

What is equivalent sphere theory and how is it used?

As with the example offered in the previous section, we very often seek to describe a shape by only one number.

However, this is problematic as a sphere is the only shape that can be accurately described by one number. Therefore, in order to arrive at a particular number to explain the size of a shape, equivalent sphere theory is frequently used.

Using equivalent sphere theory, some property of the particle is measured, and the diameter of the sphere with the same value of that property is reported as the particle’s size. This means that three or more numbers do not have to be used to describe a three-dimensional particle. Although it is more accurate to describe three-dimensional particles with three or more numbers, doing so is inconvenient and can quickly become unmanageable.

Even the smallest particles are multidimensional and it is very hard, and problematic, to describe a multi-dimensional particle using only one dimension.

As there is only one shape that can be described by one dimension – a sphere – all techniques that measure particle size relate this to an ‘equivalent sphere’.
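A quick worked example of why the matched property matters: a cylindrical rod has different ‘equivalent sphere’ diameters depending on whether its volume or its surface area is matched. This is illustrative code, not any specific instrument’s algorithm:

```python
import math

def volume_equivalent_diameter(d_cyl, h_cyl):
    """Diameter of the sphere with the same volume as a cylinder."""
    volume = math.pi * (d_cyl / 2) ** 2 * h_cyl  # cylinder volume
    return (6 * volume / math.pi) ** (1 / 3)     # sphere: V = pi * d^3 / 6

def surface_equivalent_diameter(d_cyl, h_cyl):
    """Diameter of the sphere with the same surface area as a cylinder."""
    surface = math.pi * d_cyl * h_cyl + 2 * math.pi * (d_cyl / 2) ** 2
    return math.sqrt(surface / math.pi)          # sphere: S = pi * d^2

# A 20 x 100 micron rod: the two 'equivalent spheres' disagree, which is
# why one number can never fully describe a non-spherical particle.
print(round(volume_equivalent_diameter(20, 100), 1))   # 39.1
print(round(surface_equivalent_diameter(20, 100), 1))  # 46.9
```

Two instruments measuring different properties of the same rod can therefore both be ‘right’ while reporting different sizes.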

Techniques for reporting particle size

Laser diffraction is a popular and preferred technique for reporting particle size, and it has the advantage of being one of the most accurate ways of measuring the size of a particle.

However, laser diffraction is not the only particle size analyser available. There are many instruments that can be used and each one utilises different measuring techniques.

In order to understand more about the different results that are obtained through particle measurement and analysis, it is useful to be aware of some commonly used methods, such as:

  • Sieves: This is one of the oldest and most traditional of all the methods of measuring particle size and is often used as it is cheap and reasonably effective when measuring larger particles.

However, the use of a sieve makes it impossible to measure very fine particles such as within sprays or emulsions. Reproducing results is also extremely difficult, especially when the ‘wet sieving’ technique is applied.

  • Sedimentation: This is another technique that has been around for a very long time and has long been used to measure coarse grained soils (such as sand) and has been used in the ceramics industry. However, the calculation of size with the sedimentation technique is only valid for spherical particles.

If particles are not spherical, the reported sizes will differ from the particles’ true dimensions. There are also difficulties with measuring emulsion particles, and it is necessary to know the density of a material in order to easily produce accurate results.

  • Electrozone Sensing: This is particularly useful for measuring the size of blood cells, but it poses some problems as a technique for measuring industrial materials. Samples must be suspended in a salt solution, so it is not possible to measure emulsions and many dry powders.

In addition to these limitations, measuring the particle size within sprays cannot be done.

Despite these alternative methods, laser diffraction continues to be the most accurate way of measuring particle size.

Advantages of laser diffraction include:

  • Provides a great degree of flexibility for the analysis of different materials
  • Results and answers can be provided quickly (in less than one minute)
  • It is an absolute method of particle size distribution analysis
  • There is no need to calibrate instruments against a standard
  • It is possible to measure dry powders, suspensions, emulsions, sprays and many other materials
  • The technique provides a wide and dynamic range of measurement
  • The measurement of an entire sample is possible
  • The technique can be repeated quickly and easily

While there are a range of ways to measure and analyse particle size, the most appropriate way of measuring any given material will, and should, depend on the type of material that is being measured.

Making use of the Mastersizer 3000 system

The flexibility of the Mastersizer 3000 system allows the user to develop a robust method for every type of sample and, compared with other laser diffraction systems, the Mastersizer 3000 significantly broadens the range of materials and applications to which this measurement technique can be applied.

Contact the ATA Scientific team to receive a quote today.

Looking for the perfect analytics instrument for YOUR next big discovery?

Speak with the ATA Scientific team today to get expert advice on the right instruments for your research

Request free consultation

Key Advantages and Challenges in Laser Diffraction for Particle Analysis

Laser diffraction is a popular particle sizing technique because it offers a number of advantages, but it is not without its challenges. Sampling errors are a major source of variation in any particle sizing work, particularly when measuring larger particles.

In this article, we will cover the advantages of laser diffraction, as well as the inevitable challenges it faces. We will also discuss the Mastersizer 3000, the world’s most popular particle sizing instrument, as the best solution for particle size analysis and identify a best practice sampling method.

Advantages of laser diffraction

Spherical modelling remains the accepted choice for commercial instruments intended to analyse a wide range of samples, regardless of the actual particle shape and size.

Laser diffraction advantages include:

  • An absolute method grounded in fundamental scientific principles. The instrument does not need to be calibrated against a standard, although equipment can still be validated against a traceable standard to demonstrate correct performance.
  • Wide and dynamic range. A reliable analytical instrument allows the user to measure particles between approximately 0.01 and 3500 microns in size.
  • Flexibility. Laser diffraction offers new possibilities for measuring materials. It’s even possible to measure the paint that’s sprayed from a nozzle in a paint booth. The pharmaceutical and agricultural industries are two of the many industries that have benefited greatly from such advances.
  • Dry powders. Even dry powders, from cohesive to fragile materials, can be measured through the technique of laser diffraction. Although this may result in a poorer level of dispersion than if a liquid dispersing medium was used, dry powders can be directly measured using a dry powder dispersion accessory like the Malvern Aero S. In combination with a suspension analysis, it can support the assessment of the amount of agglomerated material in a dry state.
  • Liquid suspensions and emulsions. It’s possible to use a recirculating cell to measure liquid suspensions and emulsions. This technique promotes a high level of reproducibility and facilitates the use of dispersing agents and surfactants to determine the primary particle size. If it’s possible to do so, it’s preferable to take measurements in a liquid suspension.
  • Sample measured. This technique allows for the whole of the sample to be measured. As the sample passes through the laser beam, diffraction is measured for all particles.
  • Rapid. This technique is so rapid that results can be derived in one minute or less. Feedback can therefore quickly be provided and repeat analyses can also be made quickly.
  • Repeatable. This technique is highly repeatable, and knowing that results can be relied upon helps ensure compliance with regulatory requirements (ISO 13320:2009 and USP 429).
  • Scope for additional light sources. Laser diffraction particle size analysers don’t just measure simple diffraction effects. Non-laser light sources are sometimes used to supplement the primary laser source, revealing extra information about particle size and shape.

Challenges of laser diffraction

As in many other scientific undertakings, sampling and procedure are crucial to the final data interpretation. Despite the prevalence of laser techniques, some circumstances warrant visual confirmation of the outcome using an orthogonal tool (e.g. microscopy).

One of the biggest challenges for laser diffraction methods is getting a representative sample out of a larger bulk product.

Sampling errors are the largest source of variation in any particle sizing experiment (including laser diffraction), especially when it involves measurement of larger particles.

It’s also essential that the sample preparation method is tailored to the material being measured. Particularly, one must choose between wet and dry dispersion, with aspects such as the natural state of the sample, its potential to be dispersed and the volume of the sample all coming into consideration.

Wet dispersion is the most commonly used method due to its suitability for a wider range of samples.

A good way to ensure sampling and dispersion are adequate is to look at repeatability and reproducibility of results:

  • Repeatability is a measure of the stability of a single sample (measured multiple times) and will indicate if a sample is well dispersed.
  • Reproducibility looks at several sub-samples of the same material and provides insight into the effectiveness of a sampling procedure.
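A simple way to quantify repeatability is the relative standard deviation (%RSD) of a key size statistic, such as the median (Dv50), across repeat measurements of one dispersed sample. The 3% acceptance limit used below mirrors commonly cited ISO 13320 guidance for the median (limits are typically relaxed for materials below 10 µm); the Dv50 readings themselves are hypothetical:

```python
# Illustrative repeatability check: %RSD of the median particle size
# (Dv50) across repeat measurements of a single sample. A low %RSD
# suggests the sample is stable and well dispersed.
import statistics

def rsd_percent(values):
    """Relative standard deviation of a series, as a percentage."""
    return 100.0 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical Dv50 readings (um) from five repeat measurements.
dv50_repeats = [42.1, 42.5, 41.9, 42.3, 42.0]
rsd = rsd_percent(dv50_repeats)
print(f"Dv50 %RSD = {rsd:.2f}% -> "
      f"{'pass' if rsd <= 3.0 else 'investigate dispersion'}")
```

The same calculation applied across several sub-samples of the bulk material (rather than repeats of one sample) gives a reproducibility figure, which reflects the sampling procedure instead.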

Whilst modern equipment can give quite precise results, it can never be assumed that the size of particles (produced through laser diffraction or any other type of particle sizing measurement) won’t differ from their true dimension.

Choosing the right solution

The Mastersizer 3000 is a state-of-the-art particle size analyser that produces robust and reproducible data for particle size analysis.

The Mastersizer 3000 operates with the principle of laser diffraction, which states that particles will scatter light in different ways depending on their size. Analysis of the scattered light’s angular intensity leads to accurate results relating to particle size distribution.

Some of the Mastersizer 3000’s most important features include:

  • Measurement range. A particularly broad one, spanning 10 nm to 3.5 mm. Using a series of detectors, the Mastersizer accurately measures the intensity of light scattered by particles within a sample, for both red and blue light wavelengths and over a wide range of angles, enabling high resolution, especially at the sub-micron level.
  • Minimal footprint. The optical bench spans no more than 690mm.
  • High performance in both wet and dry dispersions. The Aero S dry powder dispersion accessory aids in dry dispersion through the adjustable hopper, different feed tray designs and a choice of two different venturis — one that uses shear forces to disperse the sample and another that uses impaction — whilst the Hydro range accessories handle wet dispersion extremely effectively. Particles are delivered to the measurement area of the optical bench at the correct concentration and in a suitable, stable state of dispersion.

This allows the system to deal equally efficiently with a variety of materials ranging from pigments to milk powder and coffee.

  • Intuitive software. Every measurement is delivered quickly and easily, and the software is extremely powerful and intuitive, with user-guided workflows.
  • Rapid analysis. The Mastersizer 3000 captures light scattering at a rate of 10,000 snapshots per second, with typical measurement times of 10 seconds, even for polydisperse samples.

Best practice sampling preparation method

In order to overcome the challenges outlined earlier, the following three-step best practice sampling method should be followed when operating the Mastersizer 3000:

  1. Preparing the sample. There are two dispersion modes that can be utilised — wet dispersion, for aqueous or organic dispersants, and dry dispersion, for powder samples. This ensures that everything from coarse granulates to incredibly finely dispersed emulsions is covered.
  2. Measurement. The accuracy of the laser diffraction technique is dependent on two factors: the stability and wavelength of the light source, and the sensitivity of the detector array. As we have previously touched on, the Mastersizer 3000 features a top-of-the-line detector array, with red and blue light sources capable of resolving materials as small as 10 nanometres and as large as 3.5mm in size. As a result, even highly polydisperse samples are able to be measured accurately.
  3. Reporting. Utilising the Mie theory of light scattering, the Mastersizer 3000 measures angular intensity of light scattering in order to determine a particle size distribution. From there, results will be presented volumetrically, and measurement parameters can be tracked in real time. This allows for immediate analysis and data comparison alongside defined standards.
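Volumetric results of the kind described in the reporting step are typically summarised as percentiles of the cumulative undersize curve (Dv10, Dv50, Dv90). A sketch of how such percentiles fall out of a binned volume distribution; the bin edges and cumulative values here are invented for illustration, and real instruments report far finer binning:

```python
# Sketch: reading Dv percentiles off a binned, volume-weighted particle
# size distribution by linear interpolation on the cumulative curve.

# Hypothetical bin edges (um) and cumulative volume undersize (%) at
# each edge (monotonically increasing from 0 to 100).
edges = [1.0, 2.0, 5.0, 10.0, 20.0, 50.0, 100.0]
cum = [0.0, 5.0, 15.0, 40.0, 75.0, 95.0, 100.0]

def dv(percentile):
    """Size (um) below which `percentile`% of the sample volume lies."""
    for i in range(1, len(cum)):
        if cum[i] >= percentile:
            frac = (percentile - cum[i - 1]) / (cum[i] - cum[i - 1])
            return edges[i - 1] + frac * (edges[i] - edges[i - 1])
    return edges[-1]

print(f"Dv10 = {dv(10):.1f} um, Dv50 = {dv(50):.1f} um, "
      f"Dv90 = {dv(90):.1f} um")
```

Dv50 is the volume median; the spread between Dv10 and Dv90 is what span-style width parameters are built from.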

The Mastersizer 3000 is the latest particle size analyser that is capable of providing accurate and fast distributions in both wet and dry dispersions.


Congratulations to our winners for August 2018

The ATA Scientific Encouragement Award is hosted four times each year and provides young scientists access to financial assistance to enable them to collaborate with peers at scientific meetings and to launch their careers within their field of study. The topic of our latest competition focused on the strong link between cognitive behaviour and foods. While studies continue to show evidence that supports the idea that food can influence our thought processes, we asked our readers to consider their own experiences and whether particular foods had any influence on their ability to think and concentrate. 

After much deliberation, three entries were selected to receive our award: a first prize of $1500 and two runner-up prizes of $600 each.

Congratulations to our first prize winner Samantha Wade, PhD candidate at the Targeted Cancer Therapeutics Laboratory, Illawarra Health and Medical Research Institute, University Of Wollongong, working under the supervision of Dr Kara L. Vine-Perrow.

Congratulations to our runner up, Cameron McKnight. Cameron McKnight is a PhD student based at the Murdoch Children’s Research Institute (MCRI) in the lab of Prof. David Thorburn. 

Congratulations to our runner up, Dr Michal Bartnikowski, a Postdoctoral Research Fellow from the School of Dentistry, at The University of Queensland.  Michal is currently working with supervisor Professor Sašo Ivanovski on the development of 3D printed scaffolds for bone and periodontal regeneration. 

For further details please visit our previous winners page or contact us.


Congratulations to our young scientist encouragement award winners Nov 2018

The ATA Scientific Encouragement Award is hosted four times each year and provides young scientists access to financial assistance to enable them to collaborate with peers at scientific meetings and to launch their careers within their field of study. The topic of our latest competition focused on the possible existence of a highly advanced extra-terrestrial civilisation observing us on earth and whether they should impose on us a fair and sustainable civilisation or keep well away. We were delighted to receive so many high quality responses, all of which were deserving winners. Each entry was scored based on originality, relevance and level of entertainment.

OUR WINNERS

Three entries were selected to receive our award: a first prize of $1500 and two runner-up prizes of $600 each.

Congratulations to our first prize winner, Miss Caroline Holley, second year PhD student at the Institute of Molecular Bioscience, University of Queensland working under the supervision of Associate Professor Kate Schroder.

Congratulations to our runner up, Ms Sabrina Schönborn, an international Master’s student at the Institute of Health and Biomedical Innovation, Queensland University of Technology, working under the supervision of Dr Elena M. De-Juan-Pardo.

Congratulations to our runner up, Mr Terence Tieu. Terence is a joint PhD candidate between Monash University and CSIRO, working under the supervision of Dr Anna Cifuentes-Rius, Dr Helmut Thissen and Prof. Nico Voelcker.

For further details please visit our previous winners page or contact us.


 

Dental Applications of Scanning Electron Microscopes

What are scanning electron microscopes (SEMs)?

Where traditional microscopes use light waves, scanning electron microscopes (SEMs) use a beam of electrons to scan a sample’s surface and produce high-magnification images with a great deal of topographical information. Micrographs produced by SEMs are critical in research, development and ongoing improvements for dental and orthodontic practice.

The Phenom desktop SEM

The Phenom ProX SEM is a tabletop scanning electron microscope able to provide magnification of up to 150,000x. This powerful capability is combined with the ease of use of a traditional microscope, and Phenom SEMs fit comfortably into most laboratories and dental surgeries. They are most suitable for universities and research, design and manufacturing facilities, where high-magnification imaging and ease of use are foundational requirements.

Research and development applications

Studying dental wear with SEMs

In 2014 the Department of Surgical and Morphological Sciences at Insubria University in Varese, Italy conducted a study on four different types of dental wear:

  • Erosion
  • Attrition
  • Abrasion
  • Abfraction

The three scientists that conducted the study, Luca Levrini, Giulia Di Benedetto and Mario Raspanti, all from the University’s Oro Cranio Facial Disease and Medicine Research Centre, used a scanning electron microscope to ‘clarify the different clinical and diagnostic presentations of dental wear and their possible significance’.

They found that each lesion type had specific morphological and etiological factors and that identifying these factors allowed for the identification of complex nondental disorders, such as acid reflux. This experiment highlights the benefits of using an SEM in dental research.

Wire bending and salivary pH on surface areas

Scanning Electron Microscopes were a key measuring tool in an experiment conducted by Progress in Orthodontics into the effect of saliva pH on dental wiring. In this study, an SEM was used to observe:

  • A control group
  • Bent wiring exposed to artificial saliva
  • Straight wiring exposed to artificial saliva

The SEM’s high magnification allowed observation of the differences between the control group and the wiring exposed to saliva pH. In particular, the SEM allowed the experimenters to differentiate between striations caused during the manufacture of the wiring and other surface irregularities.

The experiment was a joint study between The Orthodontic Department of the College of Medicine and Dentistry at James Cook University, the School of Dentistry at the University Medical Centre Groningen and the School and Hospital of Stomatology at Wenzhou Medical University. They found that bending has ‘a significant influence on surface roughness and mechanical properties of rectangular SS archwires’ and that ‘pH has a synergistic effect on the change of mechanical properties of stainless steel (SS) wires along with wire bending.’

SEM in depth profiling PET analysis

An article in American Laboratory (via ResearchGate) used scanning electron microscopes to examine the changes in fibre morphology upon SVI treatment at a variety of temperatures.

The purpose of this experiment was to examine ways bulk polymers could be functionalised without modifying their potential for electronic or optical purposes. AFM-IR, which combines atomic force microscopy and infrared spectroscopy, allowed for measurements of the depth of precursors into fibres, with the aim of shortening development and processing times.

Without the use of an SEM, such depth of analysis of the polyethylene terephthalate fibres would not have been possible.

Evaluating the bond strength of dental base brackets using SEMs

A research project conducted by the University of Messina evaluated the bond failure of metal brackets bonded to enamel.

The paper, titled ‘Evaluation of Bond Strength and Detachment Interface Distribution of Different Bracket Base Designs’ and published in Acta Medica Mediterranea, showed through the use of scanning electron microscopes that brackets with greater mesh spacing had the best bond results, with double layer mesh patterns in 80/150 gauge double mesh having the best bond patterns. In this research, SEM usage aided the researchers in identifying the types of mesh that were effective for enamel bonding.

Using SEMs to chart the effects of solvent evaporation

In dental and orthodontic treatments, the effects of solvent evaporation on water sorption and solubility can have significant effects on the restorative techniques of dentin bonding materials.

An experiment conducted by the School of Dentistry at the University of Brasilia used SEM micrographs to examine nanoleakage patterns prior to, and after evaporation. Conducted over a series of days, the micrographs allowed researchers to compare and contrast different solvents under different conditions. This kind of high magnification analysis would not be possible without an SEM.

SEMs in dental and orthodontics

Scanning electron microscopes are an indispensable tool in dental and orthodontic research and development. The capacity for high magnification analysis is particularly critical to improve health, aesthetics, cost and function of materials and techniques.

Magnification on the Phenom Pharos SEM can now reach up to 1,000,000x.

Phenom provides an all-in-one imaging and analysis system. Elemental analysis is built into the desktop system, giving point-and-click element identification. The Phenom desktop SEM is also extremely fast and easy to operate, taking just 30 seconds to display an SEM image after a sample is loaded.

Contact us for a quote.

 


Watch Surface Tension and Interfacial Tension Measurements – Online Recorded Demonstration Videos

We are pleased to present this group of demonstration videos showing the latest technology from the Attension series of tensiometers. Fully modular, these systems offer accurate and versatile measurements of static and dynamic contact angle, 3D surface roughness, surface free energy, surface and interfacial tension, interfacial rheology and more.

For a personal demonstration using the Attension Theta system within your lab, please contact us!

VIDEO 1 – Theta Optical Tensiometer offers accurate measurement of surface wettability and adhesion between gas, liquid and solid phases.

VIDEO 2 – Theta 3D Topography measures both contact angle and surface roughness in a single measurement, useful for biocompatibility studies

VIDEO 3 – Theta high pressure module measures wettability and interfacial tension at high temperature and pressures for enhanced oil recovery

VIDEO 4 – Sigma offers high-precision surface and interfacial tension measurements (Platinum Du Noüy ring & rod, Platinum Wilhelmy plate)

VIDEO 5 – Fully automated critical micelle concentration, dynamic contact angle, surface free energy, powder wettability, sedimentation and density.
