The recent news of the Ultra Diffuse Galaxy (UDG) Dragonfly 44 is an excellent example of what could be termed 'observe different' thinking. The Dragonfly telescope is notable not for the size of its collecting aperture, but for the absence of the diffracting effects of secondary mirrors and surface roughness that limit the contrast of dim objects in conventional telescopes when brighter sources are nearby.
Above: a Dragonfly refractive array telescope. Image: P. Van Dokkum; R. Abraham; J. Brodie
Above: the Dragonfly 44 ultra-diffuse galaxy. "Dragonfly 44 is very faint for its mass and consists almost entirely of dark matter. (Pieter van Dokkum, Roberto Abraham, Gemini Observatory/AURA)"
Once identified, the radial velocities of the stars in Dragonfly 44 were measured using DEIMOS on the Keck II telescope, in order to determine the mass of the dim, ultra-diffuse galaxy.
I started reading the arXiv article but quickly got bogged down in the abstract. The very exciting result is that the luminosity, and therefore the total number of stars, is much smaller than one would expect from the mass obtained from the radial velocity measurements, suggesting that the galaxy is made almost entirely of dark matter. I wanted to see if I could understand how the mass was calculated, but I got stuck on the phrase deprojected half-light radius.
Could someone just outline how this calculation is done, and what that phrase actually means?
Recently a population of large, very low surface brightness, spheroidal galaxies was identified in the Coma cluster. The apparent survival of these Ultra Diffuse Galaxies (UDGs) in a rich cluster suggests that they have very high masses. Here we present the stellar kinematics of Dragonfly 44, one of the largest Coma UDGs, using a 33.5 hr integration with DEIMOS on the Keck II telescope. We find a velocity dispersion of 47 km/s, which implies a dynamical mass of M_dyn=0.7x10^10 M_sun within its deprojected half-light radius (my emphasis) of r_1/2=4.6 kpc. The mass-to-light ratio is M/L=48 M_sun/L_sun, and the dark matter fraction is 98 percent within the half-light radius. The high mass of Dragonfly 44 is accompanied by a large globular cluster population. From deep Gemini imaging taken in 0.4" seeing we infer that Dragonfly 44 has 94 globular clusters, similar to the counts for other galaxies in this mass range. Our results add to other recent evidence that many UDGs are "failed" galaxies, with the sizes, dark matter content, and globular cluster systems of much more luminous objects. We estimate the total dark halo mass of Dragonfly 44 by comparing the amount of dark matter within r=4.6 kpc to enclosed mass profiles of NFW halos. The enclosed mass suggests a total mass of ~10^12 M_sun, similar to the mass of the Milky Way. The existence of nearly-dark objects with this mass is unexpected, as galaxy formation is thought to be maximally-efficient in this regime.
The half light radius is the radius from within which half the luminosity emerges.
"Deprojected" means that the authors must have fitted some model to the 2D distribution of light, which can then be mathematically deprojected to give them a 3D model for luminosity as a function of radius, that they can then integrate to give a number for the half light radius.
In section 3, the authors explain that they have done this by fitting a "Sersic profile" (https://en.m.wikipedia.org/wiki/Sersic_profile) to the surface brightness distribution. The Sersic profile actually has the 2D half-light radius as one of its parameters. But if you imagine looking through a ball of stars, this 2D measurement of the half-light radius is an underestimate of the true 3D half-light radius, because the surface brightness profile is more sharply peaked than the 3D stellar density distribution that produces it.
The authors appear to approximately correct (deproject) this by multiplying the half light radius by 4/3. They also make a small correction for the non-sphericity of the galaxy.
The deprojection factor depends (slightly) on the Sersic index n and needs to be found by doing a numerical integral. The details can be found in the appendices of Wolf et al. (2010, http://arxiv.org/abs/0908.2995), who also provide expressions to directly estimate the mass from the projected half-light radius and the line-of-sight velocity dispersion.
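As a rough numerical check, the Wolf et al. estimator M(&lt;r_1/2) ≈ 3 σ_los² r_1/2 / G, with r_1/2 ≈ (4/3) R_e, reproduces the numbers in the abstract. A minimal sketch (the function names and rounded constants here are my own, not from the paper):

```haskell
-- Physical constants (SI units, rounded).
gSI, kpcInM, solarMassKg :: Double
gSI         = 6.674e-11   -- gravitational constant, m^3 kg^-1 s^-2
kpcInM      = 3.086e19    -- metres per kiloparsec
solarMassKg = 1.989e30    -- kilograms per solar mass

-- Deproject the 2D (projected) effective radius to an approximate
-- 3D half-light radius: r_1/2 ≈ (4/3) R_e.
deproject :: Double -> Double
deproject rEff = 4 / 3 * rEff

-- Dynamical mass (in solar masses) within the deprojected half-light
-- radius, from sigma_los (km/s) and r_1/2 (kpc):
-- M(<r_1/2) ≈ 3 * sigma_los^2 * r_1/2 / G.
dynMass :: Double -> Double -> Double
dynMass sigmaKms rHalfKpc =
  3 * (sigmaKms * 1e3) ** 2 * (rHalfKpc * kpcInM) / gSI / solarMassKg

main :: IO ()
main = print (dynMass 47 4.6)  -- ≈ 7e9, matching M_dyn = 0.7x10^10 M_sun
```

Plugging in the paper's σ = 47 km/s and r_1/2 = 4.6 kpc recovers roughly 0.7×10^10 solar masses, as quoted.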
The half-light radius is the (spherical) radius from within which half the electromagnetic power is radiated. Left unqualified it should mean the power in the entire electromagnetic spectrum, but it can also be constrained to a specific range of wavelengths. "Deprojected" has a straightforward meaning when considering a regular spiral galaxy. If you are viewing edge-on, there will be more stars within a given radius as viewed by you than if you were looking along the galaxy's axis of rotation. Deprojected here means calculating what would be found if you were looking along the axis of the galaxy. For a spiral galaxy this radius is almost identical to the radius in the plane of the galaxy.
For other shapes, deprojection means working back to the 3D distribution of the stars from a combination of measured data and a model of the galaxy that you are observing. (In my opinion the term deconvolving might be less confusing when dealing with non-spherical 3D galaxies, but history is what it is…)
Moving recursive calls in "almost tail position" to true tail position
While reading Guido's reasoning for not adding tail recursion elimination to Python, I concocted this example of almost tail recursion in Haskell:
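A minimal version consistent with the description below (a triangle-number function whose recursive call is wrapped in an x +):

```haskell
-- Triangle numbers: triangle x = x + (x - 1) + ... + 1 + 0
triangle :: Integer -> Integer
triangle 0 = 0
triangle x = x + triangle (x - 1)
```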
This is of course not a tail call, because although the recursive call is in the "return" per se, the x + prevents the current stack frame from being reused for the recursive call.
However, one can transform this into code that is tail recursive (albeit rather ugly and verbose):
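A tail-recursive version matching the description that follows (an accumulator threaded through a helper) might look like:

```haskell
-- Tail-recursive version: the pending additions are accumulated in acc,
-- so each recursive call is the entire return expression.
triangle' :: Integer -> Integer
triangle' x = innerTriangle x 0
  where
    innerTriangle 0 acc = acc
    innerTriangle n acc = innerTriangle (n - 1) (acc + n)
```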
Here innerTriangle is tail recursive and is kickstarted by triangle'. Although trivial, it seems like such a transformation would also work for other tasks, such as building a list (here acc could just be the list being built).
Of course, if a recursive call isn't in the function return, this doesn't seem possible:
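As a hypothetical illustration of that case (contrived, and not the original snippet): a function whose recursive call sits in a guard condition rather than in the expression it returns:

```haskell
-- Hypothetical, contrived example: the recursive call is consumed by a
-- guard, not returned (even wrapped in another function application),
-- so the accumulator trick above has nothing to grab onto.
deepEnough :: Integer -> Bool
deepEnough 0 = False
deepEnough n
  | deepEnough (n - 1) = True   -- recursive call in the guard
  | otherwise          = n > 3
```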
But I am only referring to "almost tail" calls, ones where the recursive call is part of the return value but isn't in tail position due to another function application wrapping it (such as the x + in the triangle example above).
Is this generalizable in a functional context? What about an imperative one? Can all functions with recursive calls in their returns be transformed into functions with returns in tail position (i.e. ones that can be tail call optimized)?
Never mind the fact that none of these are the "best" way to calculate a triangle number in Haskell, which AFAIK is triangle n = sum [0..n]. The code is purely a contrived example for this question.
Note: I've read Are there problems that cannot be written using tail recursion?, so I'm fairly confident that the answer to my question is yes. However, the answers mention continuation passing style. Unless I'm misunderstanding CPS, it seems like my transformed triangle' is still in direct style. In which case, I'm curious about making this transformation generalizable in direct style.
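For comparison, a CPS rendering of the triangle function (the name triangleCPS is mine) makes every recursive call a tail call by moving the pending x + into a continuation:

```haskell
-- CPS version: the recursive call is now in tail position; the pending
-- "x +" work is captured in the continuation k rather than on the stack.
triangleCPS :: Integer -> (Integer -> r) -> r
triangleCPS 0 k = k 0
triangleCPS x k = triangleCPS (x - 1) (\r -> k (x + r))
```

Kickstart it with the identity continuation, e.g. triangleCPS 4 id. Note that the chain of closures built by the continuations holds the deferred additions, playing the role the accumulator plays in the direct-style version.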
Concealing details of violations, multiple leaks with unknown consequences
The U.S. Army Medical Research Institute of Infectious Diseases at Fort Detrick has recently become a focus of attention. In July 2019, the U.S. military research institute, where Ebola, anthrax and other lethal “specific biological agents and toxins” were stored, was suddenly shut down. Subsequently, unexplained respiratory diseases and an extremely serious influenza outbreak occurred in nearby areas and elsewhere in the United States. The US Centers for Disease Control and Prevention (CDC) had previously found at least six violations at the Fort Detrick laboratory, including a number of leaks, but Washington declined to release details on the grounds of “national security”. China’s foreign ministry has repeatedly urged Washington to respond to the concerns of the international community and the petitions of the American people and to open the laboratory to the World Health Organization for investigation.
Since its establishment in 1943, there have been many accidents at the Fort Detrick biological laboratory in Frederick, Maryland, including the deaths of civilian employees, the loss of lethal strains, leaks of viruses and chemicals, and the involvement of scientific researchers in anthrax attacks. Nearby residents have protested for decades, demanding rectification or outright closure.
In June 2019, during a routine inspection, the CDC found at least six violations of federal regulations on the handling of specific biological agents and toxins at this facility, the US military’s only biosafety level 4 laboratory. On July 12, 2019, the CDC sent a letter expressing concern; on July 15, 2019, it ordered the Fort Detrick laboratory to cease operations. However, citing national security, the CDC refused to disclose the details of the violations and did not reveal the biological agents involved, causing strong public dissatisfaction.
An employee deliberately opened a door, risking the spread of pathogens
The US Military Times requested the CDC investigation report from the authorities under the Freedom of Information Act, but many key passages were redacted. According to the public portions, the Fort Detrick laboratory had failed to establish and consistently implement measures to secure specific agents and toxins. One of the six violations involved two leaks, but it is not known what was leaked or what the consequences were.
Another violation was the “systematic failure” of laboratory staff to implement biosafety and containment measures. CDC inspectors found that a staff member had deliberately opened the door of the high-pressure sterilization room while transferring hazardous waste. “This violation increases the risk of contaminated air entering the autoclave from the … room without respiratory protective equipment,” the report said.
In addition, the laboratory’s sewage treatment system has also failed. In May 2018, the steam sterilization plant the laboratory originally used was destroyed by a storm, and research had to be suspended for several months until a new chemical decontamination system was put into use. However, during its inspection, the CDC found mechanical failures and leaks in the new system.
The CDC also pointed out that the laboratory did not even keep a “comprehensive and accurate” inventory of specific biological agents, which is reminiscent of the loss of dozens of dangerous samples, such as anthrax, around 1991.
Questions over the unexplained deletion of reports
In February 2020, US media disclosed that the Pentagon had recovered more than US$100 million in funds from the Fort Detrick laboratory and another institution, but still did not explain why.
At the end of March 2020, as COVID-19 spread in the United States, the CDC approved the resumption of work at the Fort Detrick biology laboratory and fully restored its qualification to work with “specific biological agents and toxins”. Almost at the same time, a large number of English-language reports about the laboratory’s closure were deleted, and American netizens’ suspicions reached a peak.
An American netizen, b.z., listed the doubtful points in detail on the White House petition website, pointing out that around the time of Fort Detrick’s closure, an unknown respiratory disease appeared in the nearby area. Then the so-called “e-cigarette pneumonia” broke out suddenly in the United States, but experts and the media had doubts about the explanation of “e-cigarette-induced lung injury”. Then the United States was hit by an extremely serious influenza epidemic, and the “e-cigarette pneumonia” seemed to suddenly disappear.
Several residents living near Fort Detrick said they or their families suffered from unknown respiratory diseases in the fourth quarter of 2019, and doctors were unable to determine the cause. When COVID-19 became widely known, some wanted antibody tests to confirm whether their earlier infections had been COVID-19, but were refused. One netizen complained: “It seems that [the US government] deliberately doesn’t want us to know.”