We have already reached No. 20 in our series of “Development Stories” which we started one and a half years ago. At that time, we had intended for all our developers working on different projects to each contribute an article to the series. However, even after making requests to them we sadly received no submissions. So it was left to me, as the person responsible for managing new product development, to summarize the stories behind product planning and decisions on product specifications.

This time I will be talking about a device to be used on the job site in tandem with our electronic scales and balances for easy measurement device management: the AD-1691 Weighing Device Analyzer. In order to clarify the purpose of developing this new instrument, I will first explain some of the background behind its development before I get into the main subject of the device itself.

Electronic balance technology is based on mechatronics, a field of engineering that combines many different elements. To be precise, it is an amalgamation of (1) mechanical technology to build the mass sensor unit, (2) electronics to achieve the high resolution of the balances, and (3) software technology, which is gaining more and more in importance. You could say these three elements balance each other to produce a high-performance balance. However, we have now reached an age where, with the purchase of a high-precision electrical discharge machine (also called a wire-cutting machine) or a machining center, anyone can copy mechanical parts to a fair degree of accuracy, to say nothing of electronic components.

What I am trying to say is that the measurement device industry is now entering circumstances similar to those of the home electrical appliance industry. In other words, together with (1) economic globalization, Japan’s traditional strength of (2) essential component technology is flowing overseas. This essential component technology is becoming (3) black boxes, and through mass production in low-labor-cost countries, products that use those black boxes can now be made as (4) low-cost products, and we are beginning to see an (5) influx of these products into the Japanese market. This means that, much as in the computer industry, anyone who can build an external case and assemble these black-box components is now able to produce a commercial product in any part of the world.

These conditions bring back memories of when I started my career 30 years ago. At that time the Japanese economy was carrying all before it, and in manufacturing technology in particular Japan’s productivity was unmatched.

At the Harumi International Exhibition Center in those days, highly productive robotic equipment was the star exhibit at trade shows. These shows were extremely popular and many companies exhibited such robots. The crowds were such that you could often only move along with the wave of people, and at times it was quite an unpleasant crush. As the years passed, however, nearly all robotics manufacturers went out of business. This was the result of price wars between robotics makers, in which a few companies cornered production of the essential parts, such as motors and control boards, and a similar phenomenon of production from black boxes occurred.

The robot boom of 30 years ago was mainly restricted to Japan; now, however, you could say we have entered an age in which markets all around the world move to the same standard criteria. What Japanese businesses will have to bet their futures on is establishing creative product planning capabilities that can open entirely new markets around the world, together with the essential component technology to support them. That is to say, in order to survive, Japanese companies will have to put their present technology and capabilities to use for innovative new product planning and development. You could also say these future strategies will have to preempt potential demand in existing markets.

To take the case of the weighing device industry, mass sensors such as strain gauge load cells, and the analog-to-digital converters that convert their analog output to digital data, are already sold as separate units on the market. Device makers in Japan, Korea, Taiwan, and China, and as far away as Eastern Europe, are all producing weighing machines with the same specifications. Under these market conditions, balance makers from developed nations with more advanced technology might typically work on improving the functionality of the screen display or giving the device a more upscale appearance to keep differentiating it from competitors’ products, for example by adding a large color liquid crystal display or smartphone-like touch panel functionality.

Developers would probably receive a lot of praise from sales reps for the enticing novelty factor of such new features. Most distributors in Japan and overseas, and even most of our own sales team, would probably share the opinion that we should follow this trend.

However, the requests we get from weighing device users on the job are quite simple: they want measuring to be (1) precise, (2) quick, (3) simple, and (4) low cost. If we were to focus solely on the more aesthetic aspects of design, features like a fancy display would naturally lead to a larger device, more difficulty of use in the workplace, and a higher price: developments which of course would not benefit users.

On the other hand, for people responsible for managing measurement devices, we can assume that introducing new management methods for weighing devices has become essential due to tightening regulations. I am talking about people engaged in measurement work who, for example, manage a production line or all the measurement devices at a pharmaceutical company, work at a research and inspection company contracted for clinical laboratory tests, or are responsible for maintaining the measurement devices at their company. The sales team and field engineers within our own company would also fall into this category. To come to the point, there are two different viewpoints among people who work with measurement devices: those who handle them regularly in their everyday work, and those who are responsible for managing such devices.

It’s an obvious point, but naturally people whose business purposes differ will also be making different demands regarding their workplace tools. People who are using measurement devices in their normal workday will want devices to simply display the measured values; people with responsibility for managing such devices will be more interested in making difficult management tasks including determining “uncertainty” easier.

Considering these points, we came to the decision that we should develop an analyzer for use with weighing devices as a tool for professionals, while keeping the configuration of the weighing devices themselves simple. To be more specific, we have developed the AD-1691, a purpose-built analyzer that can connect to any A&D scale or balance with a communication function.

For your reference, I have presented the resulting graphs of 24-hour monitoring we did with the AD-1691 (AND-MEET*1) to measure the performance of our microbalance. In Fig. 1 you can see the front display panel of the AD-1691. It uses a color touch panel for interactive data retrieval. In Fig. 2 you can see the readings for the microbalance’s repeatability, which averaged 2.8 µg over the 24-hour period. I won’t explain all the details of the results here, but this example demonstrates how the AD-1691 can evaluate a balance’s repeatability performance while at the same time taking factors arising from the measurement environment into consideration.

Fig.1: AD-1691 display panel
Fig.2: AND-MEET results

The AD-1691 uses A&D’s unique Digital Signal Processing (DSP) technology, and in simple terms you can think of it as a PC specialized for use with weighing devices. Using the AD-1691 it is possible to manage multiple balances. It has functions for (1) data collection, calculation, and data file creation for repeatability measurements; (2) data sharing with standard PCs via a USB flash drive; (3) determining the uncertainty of a balance at its location of use; and (4) presenting AND-MEET results in graph form. Further, no additional software (including any special OS) is needed to use the AD-1691: the above functions can be performed simply by connecting it directly to the weighing device with an RS-232C cable.
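
To make the data-collection side of this concrete, here is a minimal sketch, in Python, of the kind of repeatability calculation such an analyzer performs on readings received over an RS-232C link. This is not A&D’s firmware or protocol: the port name, baud rate, and the line format being parsed are assumptions for illustration only.

    # Minimal sketch: collect repeated readings from a balance over RS-232C and
    # report the repeatability (sample standard deviation). The port name, baud
    # rate, and output format are assumptions, not A&D specifications.
    import re
    import statistics
    import serial  # pyserial

    def read_weight(port):
        """Parse one weighing value (in g) from a line sent by the balance."""
        line = port.readline().decode("ascii", errors="ignore")
        match = re.search(r"([+-]?\d+\.\d+)", line)
        return float(match.group(1)) if match else None

    with serial.Serial("COM1", 2400, timeout=2) as port:
        readings = []
        while len(readings) < 10:      # e.g. ten repeated placements of the same load
            value = read_weight(port)
            if value is not None:
                readings.append(value)
        print("mean [g]:", round(statistics.mean(readings), 4))
        print("repeatability (std dev) [mg]:", round(statistics.stdev(readings) * 1000, 4))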

By using these features of the AD-1691, uniform control of multiple weighing devices can be achieved easily. It also avoids problems with connecting to computers, and with incompatibility or lack of uniformity between old measurement data from obsolete devices. Because it has an interactive guidance function, the troublesome task of determining uncertainty, which involves many factors, can be handled easily by the user on the spot simply by following the on-screen guidance.
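
As a rough illustration of what such an uncertainty determination involves, the sketch below combines a few hypothetical standard uncertainty components by root-sum-square, as is commonly done; the component names and values are invented for the example and do not represent the AD-1691’s actual calculation model.

    import math

    # Hypothetical standard uncertainty components for one weighing, in mg
    components = {
        "repeatability": 0.12,                  # from repeated measurements
        "readability": 0.1 / math.sqrt(12),     # last digit, rectangular distribution
        "calibration_weight": 0.05,             # from the weight's certificate
    }

    u_combined = math.sqrt(sum(u ** 2 for u in components.values()))
    U_expanded = 2 * u_combined                 # coverage factor k = 2 (about 95 %)
    print("combined standard uncertainty [mg]:", round(u_combined, 3))
    print("expanded uncertainty, k = 2 [mg]:", round(U_expanded, 3))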

By using this first-of-its-kind specialist weighing device analyzer, I believe significant increases in productivity can be achieved in weighing work, and that it can contribute effectively to new levels of quality control.

*1 For information on AND-MEET, please refer to Development Story 12

Our tuning-fork vibration rheometer was first introduced to the market in September this year at the Japan Analytical Scientific Instruments Show (JASIS) 2012. As I have written previously, tuning-fork vibration viscometers were incorporated two years ago as one of the standard methods for measuring viscosity in the first revision of the relevant Japanese Industrial Standard in 19 years (JIS Z 8803), and have also already been accredited as viscometers subject to calibration under the Japan Calibration Service System (JCSS).

When our viscometer was first released for sale eight years ago, we received many requests from customers who wanted to know the value of the “shear rate” or to be able to adjust it in their measurements. In response, we developed a new rheometer, the RV-10000, which keeps the oscillators’ natural frequency of 30 Hz unchanged and instead varies the oscillation amplitude.

Presently, almost all rheometers are rotary-type devices. The main strengths of rotary rheometers are the wide range over which the shear rate can be varied by changing the rotation speed, and the ability to apply the shear rate uniformly to the sample through the geometry of the rotor. On the other hand, a lot of energy is required to drive the rotation, and the state of the sample may be altered from its pre-rotation condition, skewing the measurement. Low-viscosity liquids can also be displaced by the centrifugal force of the rotation, and problems with repeatability can occur with rotary rheometers as well.

In the rheometer developed from the vibration viscometer, the shear rate is varied by changing the amplitude of the oscillating sensor plates. The shear rate could also be adjusted by changing the vibration frequency, but a tuning-fork device gains its sensitivity from a sharp resonance, and moving away from that resonant frequency would reduce the sensitivity. We therefore made the oscillation amplitude adjustable rather than the frequency.

As a result, the peak-to-peak displacement of the sinusoidal oscillation can be set within a range of 0.07 mm to 1.2 mm, centered around the 0.4 mm used on the viscometer. For Newtonian fluids, this corresponds to shear rates in an approximate range of 10/sec to 1,000/sec.

For a vibration viscometer, as with an oscillating rotational rheometer, the shear rate changes constantly. Therefore, the displacement per unit time (the plate velocity) is expressed as the root mean square over one cycle. Also, with the vibration-type viscometer there is no precisely defined opposing surface with which to define the shear rate. Instead, the shear rate is obtained from the known viscosity of a reference liquid, such as water or a Japanese viscosity standard solution, together with the “shear stress” calculated by dividing the drive force required to oscillate the plates in that liquid by the area of the wetted surface.
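
To make this calibration logic concrete, here is a small numerical sketch. The 30 Hz frequency and the amplitude figure come from the text above, but the drive force, wetted area, and reference viscosity values are made up for illustration and are not RV-10000 specifications.

    import math

    # Sinusoidal motion of the sensor plates (frequency from the text; amplitude illustrative)
    frequency = 30.0              # Hz
    amplitude = 0.4e-3 / 2        # m, half of a 0.4 mm peak-to-peak displacement

    # Effective (RMS) plate velocity over one cycle
    velocity_rms = 2 * math.pi * frequency * amplitude / math.sqrt(2)

    # Calibration against a liquid of known viscosity (e.g. water at 20 degrees C)
    eta_known = 1.002e-3          # Pa*s
    drive_force = 2.0e-5          # N, drive force needed at that viscosity (hypothetical)
    wetted_area = 2.0e-4          # m^2, wetted surface of the plates (hypothetical)

    shear_stress = drive_force / wetted_area   # Pa
    shear_rate = shear_stress / eta_known      # 1/s, Newtonian relation: stress = eta * rate
    print("RMS plate velocity [mm/s]:", round(velocity_rms * 1000, 1))
    print("shear stress [Pa]:", round(shear_stress, 3), "  shear rate [1/s]:", round(shear_rate))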

While this is just a personal opinion, even with rotary cone-plate viscometers (E type), which geometrically do have a fixed opposing surface, there is no guarantee that the shear rate between the surfaces is maintained at a constant value. In other words, even if in theory a fixed shear rate is applied to the liquid filling the gap, slip or adhesion may occur at the interface between the plate and the sample, depending on the plate material, the sample being measured, and the condition of the sample surface. As a result, the flow may not be a uniform laminar flow through the depth of the gap, and a linear shear rate may not actually be applied. Moreover, with non-Newtonian fluids, I suspect that in addition to the non-linear shear-rate dependence inherent to the sample material, there is also a damping of the shear rate, and that these effects influence the measured viscosity.

The definition of viscosity in terms of the relative motion of two parallel plates is quite unambiguous; however, when it comes to actually putting this into practice, many problems can arise, such as edge effects at the plate or predicting how the plate will actually move in the experiment. Even with rotary-type devices, problems such as the behavior of the liquid at the periphery under centrifugal force must be faced.

For our company’s views on the shear rate of the tuning-fork vibration viscometer and rheometer, please refer to the conference reports and technical data on our website.

I hope you will excuse my rather long preamble, but we have obtained quite interesting data by measuring the viscosity of various liquids while changing the shear rate of the tuning-fork vibration rheometer. This includes topics such as why it is possible to run across a water suspension of cornstarch or potato starch (a dilatant fluid); the behavior of Bingham fluids, whose apparent viscosity decreases as the shear rate increases; and the thixotropic behavior of ketchup, which is at first hard to squeeze out and then starts to flow all too readily. Our results show that the behavior of non-Newtonian liquids like these can now be measured quite easily with our new rheometer.

I will not provide a detailed explanation of the results here, but hopefully the three graphs below will clearly demonstrate the measurement data.

Rheometer graph 1 and 2
Rheometer graph 3

Graph 1 demonstrates a sharp increase in the viscosity of a 62% cornstarch solution when a certain vibration amplitude (shear rate) is applied. In Graphs 2 and 3, the characteristic features of a Bingham fluid or thixotropic fluid are confirmed in hand cream and ketchup respectively.

I would also like to share a personal aside regarding the development of the rheometer and tuning-fork vibration viscometer. Thirty-two years ago, while working on my university graduation project, I studied for just over a year at the Biomacromolecular Physics Laboratory of Riken Research in Wako City, near Tokyo. The director of the laboratory at that time was Professor Eiichi Fukada, who would later go on to become the head of Riken Research. At the time, I am rather ashamed to admit, I was quite obsessed with mountain climbing and did not put much effort into my studies.

After graduating, I changed jobs a few times before finally arriving at A&D. For the first 15 years or so, I was responsible for the development of some of our electronic scales and balances. Later, I developed our tuning-fork vibration viscometer, and this time around, our rheometer.

While developing this rheometer, I paid a visit for the first time in 30 years to my previous advisers now working at the Kobayashi Institute of Physical Research, Professors Munehiro Date, Eiichi Fukada and Takeo Furukawa, for some technical advice. When I heard for the first time from Professor Fukada that a long time ago he too had been involved in the test production of a vibration viscometer I couldn’t quite believe my ears. Shortly after, he sent me the research paper from that trial. As it turns out, that paper related to the theoretical development and actual measurement data of a vibration viscometer nearly 60 years ago! It was from research Professor Fukada conducted in his early 30s, around the time I was born. I was incredibly surprised to read that the content of the paper was virtually identical to the research we had been doing in the development of our own tuning-fork vibration viscometer. It also felt extremely fateful that these events of 60 years ago, and studying under Professor Fukada 30 years ago, should come together for the development of our new version.

For a long time, rheometers were essentially conceived of as rotary-type devices. However, there is now a need for several types of measurement that are extremely challenging for a rotary rheometer, such as measuring viscosity with minimal alteration of the liquid’s physical properties, measuring the hysteresis of viscosity caused by changing the shear rate over a short period, and measuring how viscosity behaves with changes in temperature, over time, and so forth.

With the development of this new tuning-fork vibration rheometer, I believe it has now become possible to actually measure the physical properties of liquids in these ways. The establishment of a new measurement method like this brings new precision to experimental results and can often contribute to new discoveries and inventions.

We learned almost by chance that high-level research on vibration viscometers was conducted in Japan so long ago. With this historical background, and with Japan as the birthplace of the tuning-fork vibration rheometer, we hope it will prove a very effective device for evaluating the physical properties of liquids and will find many uses across a range of fields from now on.

Analytical balances are very sensitive. They are therefore heavily affected by the environment in which they are installed and by the way measuring personnel handle them. With regard to assessing the environment, running AND-MEET (*1) yields a judgment and assessment from which a concrete process for improving the environment can be proposed. In addition, in “Development Story 17” I explained a method for selecting a location for measuring instruments. So, for this edition of “Development Story”, I will discuss proper handling, with a focus on analytical balances.

The basic motto for weighing instrument operation is “quick and accurate”. By this measure, taking time to slowly open and close a breeze break door is not an optimal way of conducting a measurement: the longer the breeze break door stays open, the more the air inside the breeze break is exchanged and the more the temperature of the weighing area changes. Among analytical balances, I will use the microbalance, capable of resolving one millionth of a 1-yen coin (which weighs 1 gram), as an example.

For instance, many labs in bioscience research fields use micropipettes. Even with pipettes, we know that without accurate and experienced handling random errors occur, and that if there is a problem with the pipette itself systematic errors can occur. The minimum capacity of a micropipette is around 1 to 2 μL, and 1 μL is an extremely small amount compared with anything we handle in daily life. However, converted into mass, 1 μL of water is 1 mg, and a sensitivity of 1 mg is a typical minimum display for general-purpose precision balances in the weighing instrument industry. Yet microbalances can measure 1/1000 of this unit: one display digit equals 1 μg. In other words, determining 1 μg is even more demanding than using a micropipette, so it is clear that experience and accuracy are required in measurement work.

Electronic balances deliver weighing results via a digital display: from general-purpose balances with a minimum display of 10 mg or 1 mg to analytical balances with minimum displays of 0.1/0.01/0.001 mg. Because of this, people assume that if the weighing sample is simply placed on the pan, an accurate result will be displayed instantly. However, with readability orders of magnitude finer than the minimum capacity of a micropipette, one must question whether the displayed result is really correct, and recognize that instability in the displayed result can be quite natural depending on how the instrument is operated.

Below, I will explain how weighing errors occur, using actual examples from weighing sites.

1) Effects of static electricity

For weighing instruments used on production lines with automated machines or at sites performing plastic injection molding, there have been instances of displays becoming unstable, or of measurements changing in one direction with the passage of time. This phenomenon is called “drift” in the weighing industry. Currently, many weighing instruments are used for quality control in the production processes for pharmaceuticals, primary and secondary batteries, electronic parts such as IC chips and LEDs, and resin moldings. On these production lines, the environment is usually like that of a clean room, and we have confirmed many areas with 24-hour air conditioning and humidity sometimes below 20% because moisture is undesirable. In other words, it is dry, and friction between insulating materials as objects are moved around causes static electricity to build up easily. Moreover, people working on the line or in research can themselves build up a charge of around 10,000 volts. Under these circumstances, the effects of static electricity become greater, and errors of a few dozen milligrams can easily occur. (*2)

If the humidity in the weighing instrument’s installation environment cannot be raised above 40%, or if charge builds up faster than it can dissipate, please introduce a static eliminator and conduct weighing only after actively removing the charge from the weighing sample.

2) Effects of temperature

Let’s say you check the quality of a molded item right after resin molding by measuring it on a weighing instrument, or you measure out some pharmaceuticals into a handheld vial and then weigh it, or you take a sample from another location and bring it in to measure right away. In scenarios like these, there will be a difference between the weighing sample’s temperature and the weighing area’s temperature, and this temperature difference becomes a weighing error. The reason is that when the sample is warmer than the room, a layer of warmer air forms around the sample, generating a slight upward air current. That current effectively pushes the weighing sample up, so the measurement reads light at first. When the sample later reaches room temperature, the original weight is displayed.

It depends on the temperature difference and the shape/material of the sample, but weighing errors on the order of a few dozen milligrams can occur.

Fig.1 Thermograph image of a container after being gripped by hand

Fig.1 shows the results of thermograph observations of a coffee can placed on an analytical balance. The can had been gripped by hand for a few dozen seconds before being placed on the pan. Metal conducts heat especially well, and a deviation of a few °C from ambient temperature can occur in a short time. It is known that the convection current generated by this temperature difference affects weight measurements. (*3)

I experienced this personally more than 10 years ago, when we were setting up mass production of weights with tolerances conforming to the OIML Class E2 standard and compositions conforming to the Class F1 standard. In this instance, we found that a 200 g weight we had just adjusted measured heavier by about 0.1 mg the next day. We had handled the weight with gloves, but the adjustment and screw tightening had been enough for our body heat to warm the weight slightly. It was a good experience for understanding why people say it is not good to touch weights directly with one’s hand. In places where weighing instruments are used, one may see people picking up weights with gloves to calibrate, but at least for analytical balances, we recommend performing calibration and performance checks using tools such as tweezers.

3) Effects from work in the weighing area

Analytical balances come standard with a breeze break, which is there to stop drafts and maintain stability within the weighing area. However, if the breeze break door is operated roughly, an impact occurs at the end of its travel and the force reaches the balance’s weight sensor. This can cause variations in the zero point and risks reducing repeatability. But if the door is operated too slowly, it stays open longer and the air within the weighing area is replaced. As a result, the temperature can become unstable, another factor that degrades repeatability.

People’s hands are warmer than room temperature, and placing a hand in the weighing area disturbs the temperature there. For this reason, the door should not be kept open longer than necessary, it should be operated accurately and briskly, and long tweezers should be used so that hands enter the weighing area as little as possible.

As an aside, I searched far and wide for an off-the-shelf set of long tweezers usable for calibrating weighing instruments, but was unable to find anything fitting the description. I therefore drew up plans for an ideal set of tweezers myself and contracted the manufacturing to a maker near Tsubame-Sanjo in Niigata Prefecture. The production of these AD-1689 tweezers uses the special regional techniques that made Japan the #1 producer of eating utensils such as spoons and forks. The original “monozukuri” (craftsmanship, artisanry) techniques found throughout Japan embody skills passed down by craftsmen for more than 150 years, and these techniques are thought to have supported Japan’s economic growth from the Meiji Restoration to the present day. I believe that continuing to support them is absolutely essential to maintaining the Japanese economy going forward.

But I digress. I’ve summarized weighing instrument operation methods in a simple form below.
  •  When conducting weighing using a balance, special care must be taken with regards to the weighing sample’s static charge and temperature.
  •  It is especially necessary to actively introduce a static eliminator to take care of static electricity trouble in dry environments with humidity of less than 40%.
  •  For the weighing sample as well, care in controlling the temperature is needed, including measures such as not touching the sample directly with one’s hand. It is important to place the weighing sample in the weighing area beforehand, and allow it to adjust to the temperature there before commencing weighing.
  •  Weight measurement should be conducted quickly and accurately, the weighing area door should be opened as little as possible, and one’s hand should not be inserted into the weighing area.

Reading the precautions above, one may feel daunted by the difficulty of operating an analytical balance. However, please rest at ease. Several analytical balances with internal static eliminators are now on the market, as are models featuring a weighing preparation chamber where the weighing sample can be placed to adjust to the temperature. And there is also the set of long tweezers for weighing operations that I wrote about above.

Regarding precautions aside from weighing operation itself, it is necessary to connect the balance to a power source the day before weighing to ensure that it is stable. For weighing instruments at the semi-micro level and below, it can take 6 to 8 hours for a powered machine to adjust completely to the room temperature. Additionally, one must do as much as possible to ensure that vibrations, pressure changes, temperature changes, and humidity changes do not occur in the weighing room. As part of this, foot traffic in and out of the room should be reduced as much as possible.

Lastly, regarding handling of weighing instruments, the characteristics of an electronic balance’s electronic components become more stable the longer the instrument is connected to a power source, and the thermal distribution within the device, including the weighing chamber, evens out. Since these instruments do not use much electricity, I recommend keeping them connected to a power source continuously if possible.

I believe that going forward, weighing instrument manufacturers should not just pursue development focused on how good the performance is or how many features there are, but should also offer solutions that are easier to use on-site, including peripherals that reveal and reduce weighing errors. Moreover, this means providing a comprehensive weighing and measuring service covering everything from analysis to assessment, using environmental measurement, communication utilities, data management, graphing functions, and more. What is important for manufacturers here is knowing the weighing and measurement market, that is, the actual locations where these instruments are used. I would like to continue emphasizing market surveys and providing original products according to principles that give the best results for all parties involved.

*1 Regarding AND-MEET: Please refer to the 28th Sensing Forum: Investigation of the Basic Performance of Analytical Balances (PDF 1.28MB)
*2 Regarding the effects of static electricity: Please refer to Training Material for Balances (1) (PDF 437KB)
*3 Regarding the effects of temperature on weighing samples: Please refer to Training Material for Balances (1) (PDF 437KB)

In this edition, I want to talk about the installation environment for analytical balances, which causes a great deal of trouble in the field. In the next edition, I’ll talk about how to properly use an installed analytical balance and take measurements accurately, using knowledge gained from actual usage scenarios.

Because analytical balances are very sensitive, the environment in which they are installed affects them a great deal. For the same reason, we know that the way operators handle the balances also has a large effect. As far as assessing the measuring environment is concerned, thanks to our measurement environment evaluation tool option “AND-MEET” (*1) and our experience responding to the market, it is possible to get a clear idea of how to improve the measuring environment from an assessment of the balance’s installation environment. So for this edition, I’ve put together some general information about installation environments.

Since the March 11 Disaster, earthquakes have continued to occur frequently in eastern Japan. This is a special concern for analytical balances capable of measuring at the microgram level, for they pick up not only earthquakes but also things such as the movement of people, handcarts, and forklifts, as well as vibrations and changes in room air pressure from the opening and closing of doors.

As for weather effects, the wind from passing low-pressure systems such as seasonal storms and typhoons can cause problems by making buildings shake, which becomes an even greater issue on higher floors. Buildings with quake-absorbing structures, which have become more common recently, are designed on the assumption that they will sway, and such structures can shake for days due to wind pressure or earthquakes.

For situations like these, we have confirmed that passive anti-vibration tables such as the AD-1671 improve issues with repeatability. On the other hand, we have found that despite their high cost, active air suspension anti-vibration tables used for optical measuring instruments actually become a source of vibration, and negatively affect analytical balances.

Administrators of balances often ask us what installation environment specifications are permissible for an analytical balance. A&D recommends the following: (1) a daily temperature fluctuation of 4°C or less (within 10–30°C) and short-term fluctuations of 0.2°C per 30 minutes or less, (2) a daily humidity fluctuation of 10% or less, and (3) a daily air pressure fluctuation of 10 hPa or less. In particular, regarding the short-term temperature fluctuations in (1), it is known that the repeated slight changes in temperature caused by air conditioning have an especially destabilizing effect on a balance’s zero-point display. To cite an extreme example, our data show that even in the sort of windy environment specified by the Ministry of the Environment’s Manual for Continuous Monitoring of Air Pollution (covering PM2.5 and the like), using the AD-1672 tabletop breeze break (which surrounds the balance) can improve conditions enough that the catalog specifications for the microbalance can be met. (*2)
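
As a simple illustration of how these recommendations could be checked against logged data, here is a sketch that tests one day of temperature readings (one per minute) against the daily and 30-minute limits above; the data layout and function name are assumptions for the example, not part of any A&D product.

    def check_temperature_log(temps_per_minute):
        """Check one day of temperature readings (one per minute, in degrees C)
        against the recommendations above: daily swing <= 4 C within 10-30 C,
        and short-term swing <= 0.2 C per 30 minutes."""
        daily_swing = max(temps_per_minute) - min(temps_per_minute)
        in_range = all(10.0 <= t <= 30.0 for t in temps_per_minute)
        worst_30min = max(
            max(temps_per_minute[i:i + 30]) - min(temps_per_minute[i:i + 30])
            for i in range(len(temps_per_minute) - 29)
        )
        return {
            "daily_swing_C": round(daily_swing, 2),
            "daily_ok": daily_swing <= 4.0 and in_range,
            "worst_30min_swing_C": round(worst_30min, 2),
            "short_term_ok": worst_30min <= 0.2,
        }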

Allow me to explain proper installation of a balance using an actual example. Fig. 1 is a rough sketch modeled on the seminar room on the 2nd floor of our R&D center. The seminar room is about 10 meters on each side, and there are multiple air conditioning units in the central area of the ceiling. It would be rather large for a lab, but it resembles many labs in the layout of things such as the air conditioners and lab tables. I’ve numbered the tables in this diagram from #1 to #16. I would like you, too, to think about which spot in the seminar room (lab) is the best place to install a balance.

Fig. 1 – Diagram for evaluating balance placement

To select a location for the balance, we must first find a spot that minimizes temperature fluctuations, which have the greatest effect on a balance’s performance. To be more precise, a place that is (1) out of direct sunlight and (2) far away from air conditioner vents. Next, we select (3) a corner of the room next to a wall. The center of a room is structurally weaker, and the floor there tends to shake more easily, whereas the corners of a room tend to have structural supports and do not shake easily. In addition, even if the room temperature is being controlled at a certain level, floors and walls often fall below room temperature, especially in winter. A stable temperature means the temperature is evenly distributed (the flow of heat is even), but where a wall has outside air on its other side, a balance near that wall may be constantly subjected to outside temperature variations. For the same reason, the balance should not be installed near window glass. That is why it is best to (4) install the balance near a wall that has another room on its opposite side. As for the table on which it is installed, (5) a rigid balance table with high heat capacity should be used, and (6) the balance table should be separated by a few centimeters from the wall and other tables in order to isolate it from heat and vibration coming through the wall and floor. (7) A dead-end area with low foot traffic should be selected, because people tend to come and go through the central part of a room. To further reduce people’s influence, (8) an area far away from the door should be used, on a table where (9) only measurement is conducted, so that vibration from people’s activities does not affect the balance. Additional preconditions are that the room and the wall where the balance is located should be (10) far away from routes where heavy traffic or heavy objects pass, and (11) on as low a floor as possible.

Applying the above conditions to Fig. 1, we can determine that the best places in the room for a balance are #3 and #2: areas where the effect of direct sunlight is low, air conditioner vents and windows are far away, walking routes and doorways are far away, and structural members such as supports are nearby. Potential issues with #3 are that it is near an outside wall and near a wall facing a hallway, but I believe this is not a problem because only people pass through the hallway.

The above constitutes a general assessment of balance installation environments, but labs often have individual circumstances, such as housing a heat-treating furnace or having many people coming and going during the day. Ultimately, the best course of action is to run AND-MEET in the locations where balances are to be placed, assess the environment there, identify any problems, and develop concrete measures to deal with them.

To sum up the above, here is what is required of a balance installation environment.

  • Be especially sure to consider the room temperature stability, and do not place a balance near an air conditioner vent in order to reduce the effects of temperature variations. If there is no other option, then utilize things such as tabletop breeze breaks or partitions to cut off direct wind.
  •  The balance should be placed in an area out of direct sunlight, away from routes people use and away from persons working on other things. To minimize the effects of vibrations, the central area of a wide floor should be avoided, and an area near the building’s supports should be selected. At this point, the balance table should be separated a few centimeters from walls and supports in order to isolate it from vibrations and heat from the building.
  •  To reduce effects from vibrations, the balance should be placed in a location as far as possible from paths for moving heavy objects.
  •  The building will shake when low pressure systems cross the area, so install the balance on as low of a floor as possible. In addition, to reduce the effects of the building shaking due to earthquakes and vibrations, an anti-vibration table should be installed.

Forty years have passed since the balance was transformed into an electronic device using microcomputers. Since then, digitalization has progressed, and the balance has come to be regarded as an instrument that anyone can use easily. At present, however, analytical balances have resolutions of 1/20,000,000 or more, and a certain level of skill and preparation is required to perform exact measurements. In particular, with regard to balance installation, there are many matters to take into consideration. I hope this article will help you understand the best environment, install the instrument and set up its surroundings accordingly, and conduct reliable measuring work.

*1 Please refer to Development Story 12: Solutions Provided by the BM Series Part II
*2 Please refer to the 28th Sensing Forum: Investigation of the Basic Performance of Analytical Balances (PDF 1.28MB)

A&D released the newly developed HR-AZ/HR-A series of analytical balances in January 2012. Before discussing the development of these analytical balances, I’d like to talk about the definition of an analytical balance.

In the measurement instrument industry, all weighing instruments are called “scales”. Within the category of “scales”, instruments that have a fulcrum in their mass sensor section and a mechanism to balance the object being weighed via that fulcrum are defined as “balances”. Balances can weigh the smallest quantities of any weighing instrument. A high resolution (capacity divided by minimum display) can be achieved thanks to the mechanism that counterbalances the object via the fulcrum and returns the beam supported by the fulcrum to its original balance point, a principle commonly called the zero method.

There are two types of balances: general purpose and analytical. Generally, general-purpose balances have a minimum display of 1 mg or larger, and analytical balances have a minimum display of 0.1 mg or smaller. Incidentally, the minimum display is expressed in “dig”, and an analytical balance’s minimum display is written, for example, as “1 dig = 0.1 mg”. Traditionally, one finger, or digit, was used to express a minimum unit; “digit” was shortened to “dig” and came to denote the minimum display digit of a balance.

As you know, standard analytical balances have a capacity of 200 g and a minimum display of 0.1 mg. For example, weighing in the pharmaceutical industry for Chinese herbal medicine and Western medicine is performed with a minimum display of 10 and 1 mg, respectively, so balances have historically been required to have one more digit of precision in their measurement performance, or 0.1 mg.

Let’s consider the resolution of a standard analytical balance like the one above with a specification of 200 g × 0.1 mg.

   200,000 mg ÷ 0.1 mg = 2,000,000

In other words, this analytical balance has a resolution of 2 million to one. The resolution of typical measuring instruments is usually around 0.1 to 0.01%, that is, one thousand to ten thousand to one. Even a mechanical contact micrometer, for example, has a resolution of only several hundred thousand to one.

We can better understand the high resolution of an analytical balance using length as an example. The distance between Tokyo and Osaka is 500 km. Therefore, 1 dig at 2 million to one can be calculated as follows:

   500 km ÷ 2,000,000 = (500 × 1,000 m) ÷ (200 × 10,000) = 0.25 m = 25 cm

Accordingly, this balance can be said to have the ability to measure the distance between Tokyo and Osaka in increments of about 25 cm (roughly the span between an outstretched thumb and little finger).

If we used Mt. Fuji as an example, the increment is as follows:

   3,776 m ÷ (200 × 10,000) ≈ 0.0019 m ≈ 1.9 mm

In this case, if we were to slice Mt. Fuji by height into slices roughly 2 mm thick (about the thickness of a coin), the balance would be able to detect each slice. From these examples, we can see that resolving the minimum display digit of an analytical balance is no easy matter.
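
The same arithmetic can be written out in a few lines of Python, simply restating the calculations above:

    capacity_mg = 200_000            # 200 g capacity
    minimum_display_mg = 0.1
    resolution = capacity_mg / minimum_display_mg      # 2,000,000

    tokyo_osaka_m = 500 * 1000       # about 500 km between Tokyo and Osaka
    print(tokyo_osaka_m / resolution)   # 0.25 m, i.e. 25 cm per display count

    mt_fuji_m = 3776
    print(mt_fuji_m / resolution)       # about 0.0019 m, i.e. roughly 2 mm per display count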

After that slightly long introduction, let’s return to the development of the HR-AZ/HR-A series of analytical balances. As mentioned in another installment, the current HR series was developed around 20 years ago as our first top-loading analytical balance. At the time, an analytical balance meant an instrument with a large mass sensor in the back and a weighing chamber in the front. This type of balance had an extremely large mechanism, giving it a high heat capacity, and a powerful magnetic circuit to generate the balancing force, so its lever ratio was small. Because of its high thermal stability and performance, it is still well regarded as a high-precision analytical balance. In terms of sales volume, however, the top-loading systems of general-purpose balances that first appeared 20 years ago are now the mainstream even for analytical balances.

The HR series, which pioneered top-loading analytical balances, became a very long-selling product that continues to sell today, even after its contemporary competitors have long since disappeared. However, it is true that as the years have passed, its liquid crystal display and overall design have made it seem a little old. Furthermore, analytical balances are involved in a fierce price war outside Japan, particularly in Asia, and the HR series has long been due for a price and feature refresh.

Under these circumstances, a long development process started. Since the cost of developing a new mass sensor would run to hundreds of millions of yen, I initially thought it would be best to repurpose an existing sensor. I came to the easy conclusion that reusing a sensor would make it relatively quick to come up with a new product. Consequently, we started development of a new analytical balance with 1 dig at 0.1 mg using the C-SHS mass sensor developed for our FZ-i/FX-i series of general-purpose balances. That sensor has excellent span stability and other performance features, and I thought it would be possible to reach the performance we needed quickly.

However, once we started, we struggled hard to achieve repeatability at 0.1 mg. Looking back now, I can see that I had not realized we would need breakthrough technology in two areas: the sensitivity of the mechanical parts that make up the sensor, and the electronic circuits carried over from existing general-purpose balances.

We spent many hours achieving the basic performance. We changed the way we machined parts to overcome processing limits, and tested repeatedly to limit the variation caused by individual differences in the characteristics of the electrical parts. After repeated confirmation testing and discussion, we managed to reach our product goals. Nevertheless, it took three full years from the start of development to reach the sales stage.

While I had plenty of experience developing new products by this time, when we hit the required level of performance after a long period of resignation among the development team, it felt like we had finally found the light at the end of a long tunnel that we thought was a dead end.

HR-AZ/HR-A Series

For the newly developed HR-AZ/HR-A series of analytical balances, we raised the capacity from the existing HR series’ 200 g to 250 g and achieved a high-speed response, with real-world weighing stabilizing in about 2 seconds. For the display, we used a reverse-backlit LCD so that weighing values remain visible in low-light settings. The AZ series uses a highly reliable internal calibration weight mechanism that employs a unique operation method. The breeze break comes as standard, and it detaches with one touch for ease of cleaning and for operation without a breeze break. Furthermore, all components of the breeze break are made of plastic and coated with a durable antistatic coating. These features were added to solve two issues: static electricity, the most troublesome aspect of using an analytical balance, and requests for a glass-free product, which is required for medicine and food production lines subject to FDA/HACCP regulations.

The breeze break also offers installation advantages, such as the ability to set up the balance with its rear surface close to a wall, since the doors do not protrude from the rear of the balance when the side doors are opened. In addition, thanks to the one-touch removable breeze break and the compact size of the balance, it can be embedded in an automated production line or placed in an isolated space.

While it took longer than expected to develop, we believe we have created a balance that is unprecedented in price/performance ratio, simplicity, compactness, and usability. As a result, we expect the new HR-AZ/HR-A series of analytical balances to offer users a number of convenient features that create opportunities for weighing in new fields and open up new markets.