Sponsored Content by Entegris | Jun 15, 2021
In this interview, Mark Bumiller, Technology Manager at Entegris, talks to News Medical Life Sciences about the role nanoparticles can play in drug delivery.
Could you give our readers some background on Entegris?
With over 50 years of experience, Entegris is a global leader in Advanced Materials Science and strives to help customers utilize advanced science-based solutions to support and innovate faster and more efficiently.
How are nanoparticles typically defined?
Typically when we talk about nanoparticles, we talk about particles where one of the dimensions is in the range of 1 to 100 nm. Both the ISO definition and the ASTM definition align with this measurement range.
The European Commission has a more complicated definition because it is more concerned with toxicity and consumer exposure to manufactured nanomaterials than with nanoparticles used for drug delivery. The Commission states that 50% or more of the particles in the number size distribution should have at least one dimension between 1 and 100 nanometers, with specific risk assessments required on that basis.
While nanoparticles are generally considered to have at least one dimension under 100 nm, that definition sometimes gets stretched in terms of the upper size limit. For instance, Wikipedia’s information on nanoparticles for drug delivery for the brain provides a definition which states that particles with one dimension between 10 to 1000 nm (1 micron) can be understood to be nanoparticles.
Image Credit: Shutterstock/lookerstudio
Can you provide an example of a typical nanoparticle application, for example, in vaccines?
Another way of defining nanoparticles is to look at typical examples. A solid lipid nanoparticle, for instance, could be used to encapsulate a drug or something like the messenger RNA present in the vaccines that we are taking.
There is a single layer of phospholipid on the exterior surface. This makes the exterior of the particle hydrophilic, allowing us to effectively hide organic molecules inside the particle. The particle will also be comfortable in an aqueous dispersion.
When we work at the nanoscale, we can also do interesting things on the surface of particles. For example, we can functionalize the surface, adding ligands targeting antibodies, drugs, or peptides. This helps the particles seek out things like tumors to selectively adhere to that surface to deliver the therapeutic payload.
What are some key advantages of working at the nanoscale?
There are various advantages to working at the nanoscale. One of the key benefits is an improvement in the surface area to volume ratio. For example, if we look at a series of cubes, each time we halve the edge length of a cube, the surface area to volume ratio doubles. As we move to smaller sizes, we gain proportionally more surface area for a given volume or given mass of the drug particle.
This is important because, at these smaller sizes, more of the atoms are actually on the surface of the particle. More atoms on the surface of the particle will mean that the particle will be more reactive.
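The cube scaling described above can be sketched in a few lines of Python; the edge lengths used here are illustrative values, not figures from the interview:

```python
# Surface-area-to-volume ratio of a cube: 6 * a^2 / a^3 = 6 / a.
# Halving the edge length therefore doubles the ratio.
def sa_to_vol(edge_nm: float) -> float:
    """Surface area divided by volume for a cube of edge edge_nm (units: 1/nm)."""
    surface = 6 * edge_nm ** 2
    volume = edge_nm ** 3
    return surface / volume

for edge in (1000.0, 500.0, 250.0):  # illustrative edge lengths in nm
    print(f"edge = {edge:6.0f} nm -> SA/V = {sa_to_vol(edge):.4f} per nm")
```

Each halving doubles the ratio, which is exactly why a given mass of drug at the nanoscale exposes proportionally more atoms at its surface.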
Particles at these smaller scales will also start to acquire unique properties; for example, some drugs exhibit better water solubility at a smaller scale, enabling higher circulation time of the drug through the patient. We are also better able to fine-tune the drug’s controlled release properties.
How important are accurate size measurements in ensuring drug solubility and bioavailability?
Accurate size measurement is critical. A lot of new drugs never get their approval because of poor water solubility, so by decreasing the size, you can improve the solubility and the bioavailability of drugs while better ensuring they meet regulatory requirements.
For example, reducing the size from 100 microns down to 10 microns, then down to 100 nm will lead to a significant increase in the surface enlargement factor. We also start to see quicker dissolution at this scale, and this can be understood using the Noyes–Whitney equation.
Overall, there is a direct relationship between the size of the particle and the dissolution rate. Being able to measure the size of particles accurately can help us to predict these dissolution rates. It is worth noting that at smaller sizes, we often need to stabilize the drugs - this is often done by adding surfactants to the surface of the particles.
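The Noyes–Whitney relation mentioned above gives the dissolution rate as dm/dt = D·A·(Cs − C)/h, where D is the diffusion coefficient, A the total surface area, Cs the saturation solubility, C the bulk concentration, and h the diffusion layer thickness. A minimal sketch, with all parameter values as illustrative assumptions rather than measured data:

```python
# Noyes-Whitney dissolution rate: dm/dt = D * A * (Cs - C) / h
# D: diffusion coefficient, A: total surface area, Cs: saturation
# solubility, C: bulk concentration, h: diffusion layer thickness.
def dissolution_rate(D: float, A: float, Cs: float, C: float, h: float) -> float:
    """Mass dissolved per unit time (units follow the inputs)."""
    return D * A * (Cs - C) / h

# Milling a fixed drug mass into smaller particles increases total surface
# area A; doubling A doubles the dissolution rate (illustrative values):
slow = dissolution_rate(D=1e-9, A=1.0, Cs=2.0, C=0.0, h=1e-5)
fast = dissolution_rate(D=1e-9, A=2.0, Cs=2.0, C=0.0, h=1e-5)
```

This makes the size-dissolution link concrete: the particle size enters through A, which is why accurate size measurement helps predict dissolution rates.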
Image Credit: Shutterstock/EMkaruna
How does the Entegris Nicomp® support particle size measurements at the nanoscale?
The Entegris Nicomp® provides us with a lot of data on nanoparticles. The instrument offers two algorithms for use with dynamic light scattering. The Gaussian algorithm will always fit the results to a single symmetric peak. There is also a proprietary Nicomp® algorithm, which can split that result into multiple individual peaks. This algorithm can also resolve peaks that are very close, allowing different distributions to be identified even when their sizes are relatively close to one another.
This information is especially important when formulating and processing these drugs, as it tells us whether we are dealing with a single peak or multiple peaks in the distribution of the particles being processed.
Image Credit: Entegris
Can you give our readers an overview of the principles of dynamic light scattering?
Dynamic light scattering (DLS) is the most popular technique for performing size measurements at the nanoscale, though some people do use TEM and some other novel techniques to try to acquire these particle sizes.
In dynamic light scattering, particles move according to a random thermal motion that we call Brownian motion. Small particles move faster and larger particles move slower. A conventional DLS system includes a number of standard components.
The light source projects light into the sample, while a detector (sitting at 90 degrees) counts photons, measuring fluctuations in the count rate caused by the Brownian motion of the particles. These fluctuations are built up into a correlation function, whose decay yields the particles' diffusion coefficient. The Stokes-Einstein equation can then be used to convert this diffusion coefficient into the radius of the particle.
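The final conversion step can be sketched with the Stokes-Einstein relation, d = k_B·T / (3π·η·D). The temperature, viscosity, and diffusion coefficient below are illustrative assumptions for a particle in water at 25 °C:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_diameter(diff_coeff: float, temp_k: float = 298.15,
                          viscosity_pa_s: float = 0.00089) -> float:
    """Stokes-Einstein: hydrodynamic diameter (m) of a sphere from its
    diffusion coefficient (m^2/s), defaulting to water at 25 C."""
    return K_B * temp_k / (3 * math.pi * viscosity_pa_s * diff_coeff)

# A particle diffusing at ~4.9e-12 m^2/s in water at 25 C comes out at
# roughly 100 nm (illustrative diffusion coefficient):
d_nm = hydrodynamic_diameter(4.9e-12) * 1e9
```

Note the inverse relationship: the faster the particle diffuses, the smaller the diameter the equation returns, matching the small-fast, large-slow picture above.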
Overall this is a fairly easy technique, though people sometimes struggle with sample preparation and data interpretation.
What are some current and recent trends in the market for drugs at the nanoscale?
As well as a wide range of current drugs being on the market, there is also a lot of R&D taking place. In terms of sales of the Nicomp® instrument in the pharmaceutical industry, at least 50% of these sales are to research organizations and research universities where researchers are working to create new drugs, or investigating drugs currently under development.
There are already drugs out in the field and saving lives that are based on the nanoscale, with many of these employing liposomes and polymers. Nanocrystals are also popular, and these originally came from the Elan nanocrystal milling technique. We also see a range of inorganic particles, micelles and proteins used in current drugs.
Liposomes were first discovered and formulated in 1965. The first nanoparticle drug to enter the market was Doxil, a liposomal formulation of doxorubicin, which arrived in 1995. The Doxil drug was encapsulated in a liposome delivery vehicle, and various other drugs also use liposomal or micellar bases, such as Myocet, Marqibo, and Genexol-PM (a micellar formulation).
Other current examples of nanoparticles in pharmaceuticals include the NanoTherm treatment (which uses an iron oxide nanoparticle) and the BIND-014 particle (a block copolymer of polyethylene glycol and PLGA).
What approaches are used to fine-tune controlled release qualities in drugs at the nanoscale?
When we work with the nanoscale, we can try to fine-tune the controlled release qualities of drugs. To do this we need to look at the pharmacokinetic data for the nanoparticle in question. We tend to report the drug concentration in the patient as a function of time, and in many standard examples this will drop down to zero fairly quickly.
However, if we take an identical dose of the drug and encapsulate this in something like the BIND-014 nanoparticle, the concentration of the drug in the patient will last much longer, allowing the drug to keep recirculating and continuing to address whatever issue it is designed to address. This improves the profile of the drug concentration, hitting the places where it needs to provide therapeutic benefit to the patient.
Another useful technique is to use a lipid layer: oil-soluble drugs can be effectively hidden within the lipid layer, while water-soluble drugs can be carried in the hydrophilic core, allowing us to deliver both hydrophobic and hydrophilic drugs within an aqueous suspension. We can also undertake surface modification to help with targeted delivery.
What is ‘zeta potential’ and how does the Nicomp® DLS system make this available for drug discovery applications?
Zeta potential has existed as a concept for quite some time, with instruments designed to measure zeta potential now being fairly common.
If a particle has a negative charge on the surface, it will attract positive ions, which will build up. Negative ions are then attracted to the positive ions, building up something referred to as an electric double layer.
Zeta potential can be understood as a potential, measured in millivolts, at a small distance from the surface of the particle. We call this distance the slipping plane; within that distance, the ions move with the particle when it moves.
We measure this by placing the particle in a cell and using electrodes to apply an electric field. The direction that the particle moves in tells us whether it is positively or negatively charged, and we can measure the speed the particle moves at to determine the magnitude of the charge. This is another fairly simple measurement to make.
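One common way to convert the measured electrophoretic mobility into a zeta potential is the Smoluchowski approximation, ζ = μ·η/ε. To be clear, this is a standard textbook model offered here as an assumption, not necessarily the exact algorithm the instrument uses, and the water properties below are likewise illustrative:

```python
# Smoluchowski approximation: zeta = mobility * viscosity / permittivity.
# Water properties at 25 C (illustrative assumptions):
EPS_WATER = 78.5 * 8.854e-12  # absolute permittivity, F/m
ETA_WATER = 0.00089           # dynamic viscosity, Pa*s

def zeta_mv(mobility: float) -> float:
    """Zeta potential in millivolts from electrophoretic mobility (m^2/(V*s)).
    The sign of the mobility carries the sign of the surface charge."""
    return mobility * ETA_WATER / EPS_WATER * 1000.0

mu = 3.12e-8  # m^2/(V*s), illustrative mobility value
zeta = zeta_mv(mu)  # roughly +40 mV under these assumptions
```

The direction of motion fixes the sign and the speed fixes the magnitude, just as described above.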
The Journal of Controlled Release recently published data relating to zeta potential in terms of particle size. The study in question looked at the stability of a series of nanoparticles, specifically a mixture of calcium phosphate with a lipid coating. The study discovered that by altering the ratio of the calcium phosphate to liposome in the mixture, it was actually possible to fine-tune the surface charge of the particle.
The researchers measured the surface charge of the particle and discovered that, as the ratio of calcium phosphate to liposome was changed, it went from a negative charge, through a point of zero zeta potential, and finally ended up at a positive zeta potential of 40 millivolts.
Two things are important here. Being able to fine-tune the surface charge matters because, with this particular drug, a positive zeta potential both increased drug loading and improved the drug's ability to cross cell membranes during delivery.
If there is no charge on the surface of the particle, then particles can actually get close enough to start aggregating. That means that, generally, if we want to formulate stable dispersions of particles, we need to have some sort of charge, either negative or positive. A dispersion at zero zeta potential usually exhibits the least stability, so this should be avoided.
We can perform the measurements I have just mentioned using dynamic light scattering. Some particles do not scatter much light, however, so one way to challenge any DLS instrument is to attempt to measure these particles.
When I first joined the company, I had struggled to make these measurements using other instruments in the past. I challenged the Nicomp® with lysozyme at 0.1 mg/mL, and in one experiment, after starting at 9:00 am, I had all the data I needed by 2:00 pm. I was impressed by the sensitivity of the Nicomp® and its ability to perform measurements at low concentrations.
Can you give our readers some examples of DLS measurements in practical applications?
In a recent publication in the International Journal of Nanomedicine, the Nicomp® was used to optimize the processing of liposomes, particularly in determining their polydispersity index (PDI); polydispersity is a measure of the width of the distribution.
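As a simplified sketch of what a polydispersity index expresses, think of it as the squared relative width of the size distribution. Real DLS instruments derive the PDI from a cumulants fit of the correlation function, so this is an idealized illustration, and the size lists are made-up values:

```python
import statistics

def pdi(sizes_nm: list) -> float:
    """Squared relative width of a size distribution: (sigma / mean)^2.
    0 means perfectly monodisperse; larger values mean a broader spread."""
    mean = statistics.fmean(sizes_nm)
    sigma = statistics.pstdev(sizes_nm)
    return (sigma / mean) ** 2

narrow = pdi([98.0, 100.0, 102.0])  # nearly monodisperse sample
broad = pdi([50.0, 100.0, 150.0])   # wide distribution
```

A lower PDI is what the homogenization step described next is trying to achieve.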
This experiment used a homogenizer to attempt to standardize particle distribution as much as possible. The results confirmed a decrease in size as the pressure of the homogenizer increased. This data was helpful in terms of optimizing the ideal pressure at which to run the homogenizer.
As particles continued to cycle through the homogenizer, their size kept decreasing. Processing the liposomes in a homogenizer or microfluidizer adds the energy required to enable self-assembly. Users can then take measurements and tweak the process, repeating this as necessary.
This example made use of a lab measurement to help control the process, but measurements can also be conducted directly in the process. Another example, using BIND nanoparticles (a PLA-PEG block copolymer), required a homogenization step in the process, inserted with a view to creating these nanoparticles at a scale of about 100 nanometers.
The system in question drew fluid from downstream of the homogenizer, a diluent source was used to dilute the sample for the measurement, and the measurement itself was taken with a DLS system. This case study also revealed that an increase in homogenizer pressure resulted in a decrease in particle size.
After running the experiment several times and performing a correlation study, the study found that the result was a 9 nm change in particle size per 1000 psig change in pressure.
The researchers later processed a batch to explore this further. In the beginning, the particle size dropped to about 94 nm, slightly too small. Based on the results of the previous experiment, it was possible to adjust the pressure until the particle size was closer to the desired 100 nm.
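The correlation reported above (roughly 9 nm of size change per 1000 psig of homogenizer pressure) lends itself to a simple linear correction. The function and variable names here are assumptions for the sketch, not part of the published study:

```python
# Reported sensitivity: particle size drops ~9 nm for every 1000 psig
# increase in homogenizer pressure (linear model from the case study).
SLOPE_NM_PER_PSIG = -9.0 / 1000.0

def pressure_correction(current_nm: float, target_nm: float) -> float:
    """Signed pressure change (psig) needed to move the particle size
    from current_nm to target_nm under the linear model."""
    return (target_nm - current_nm) / SLOPE_NM_PER_PSIG

# A batch measuring 94 nm is slightly below the 100 nm target, so the
# model says to back the pressure off by roughly 670 psig:
delta_psig = pressure_correction(current_nm=94.0, target_nm=100.0)
```

A negative result means reducing the pressure, which grows the particles back toward the target size.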
A third experiment involved a clinical scale study. Throughout this process, all of the particles were very close to the perfect size of 100 nm. This is an important consideration because it means that drug manufacturers processing this particular product can be confident that 100% of the product is within specification.
This is even more crucial when the value of that product is very high; for example with many of the experimental drugs currently under development.
Where do you see DLS and nanoparticle-based drug discovery going next?
Around 95% of all DLS measurements are still being performed in laboratory settings, but our Florida office is currently building a number of systems designed for experimentation and implementation in processes. It is possible to perform DLS measurements in process and in real-time, so this is an exciting development for the technology and the field.
There is a lot of interesting R&D and implementation currently taking place in developing nanoparticles to be used for drug delivery. Particle size is a key issue, and to make those measurements accurately it is important to use the best possible tool.
The Entegris Nicomp® DLS system can measure size and zeta potential with very high resolution. Entegris also offers a whole product line designed to measure oversized particles - the AccuSizer® - that can complement the Nicomp® DLS system where required.
For over 35 years, the Entegris AMH Instrumentation team has been committed to helping customers find solutions to their particle sizing problems. We offer products with unique capabilities that can size particles from single-digit nanoparticles to particles that are thousands of microns in diameter. Whether you need a particle size distribution or to find that needle in a haystack, our team of engineers and sales staff are available to provide the product knowledge and scientific expertise necessary to solve your sizing issues.