ephemeralnewyork posted:
"The elevated railroad, perpetually 'tearing along' on its stilted, aerial highway, was 'an ever-active volcano over the heads of inoffensive citizens," wrote one Australian visitor who came to New York in 1888.
That description gives us an idea of the feel of Gotham in the late 19th century, when steam-powered (later electric) elevated trains carried by trestles and steel tracks ran overhead on Ninth, Sixth, Third, and Second Avenues.
The upside to the elevated was obvious: For a nickel (or a dime during off hours), people could travel up and down Manhattan much more quickly than by horse-drawn streetcar or carriage. New tenements, row houses, and entertainment venues popped up uptown, slowly emptying the lower city and giving people more breathing room.
The downside? Dirt and din. The trains and tracks cast shadows along busy avenues, raining down dust and debris on pedestrians. (No wonder Gilded Age residents who could afford to changed their clothes multiple times a day!) And then there was the deafening noise every time a train chugged above your ears.
Now as unpleasant as the elevated trains could be in general, imagine having the tracks at eye level to your living quarters. Life with a train roaring by at all hours of the night was reality for thousands of New Yorkers, particularly downtown on slender streets designed for horsecars, not trestles.
"The effect of the elevated—the 'L' as New Yorkers generally call it—is to my mind anything but beautiful," wrote an English traveler named Walter G. Marshall, who visited New York City 1878 and 1879.
"As you sit in a car on the 'L' and are being whirled along, you can put your head out of the window and salute a friend who is walking on the street pavement below. In some places, where the streets are narrow, the railway is built right over the 'sidewalks'...close up against the walls of the houses."
Maybe these unfortunate New Yorkers lived in a tenement before the trains came along, and they couldn't find alternative housing after the elevated was built beside their building. Or perhaps in the crowded city teeming with newcomers at the time, a flat next to a train was the best they could find with what little they had to spend.
Wrote Marshall: "The 19 hours and more of incessant rumbling day and night from the passing trains; the blocking out of a sufficiency of light from the rooms of houses, close up to which the lines are built; the full, close view passengers on the cars can have into rooms on the second and third floors; the frequent squirting of oil from the engines, sometimes even finding its way into the private rooms of a dwelling-house, when the windows are left open—all these are objections that have been reasonably urged by unfortunate occupants of houses whose comfort has been so unjustly molested...."
Eye-level elevated trains continued into the 20th century, with aboveground subway tracks as well as the older els making it ever more likely that New Yorkers would find themselves with a train rattling and shaking their windows.
And it's still an issue today, of course, even with those original el lines long dismantled. Tenements and apartment buildings near bridge approaches, tunnel entrances, and aboveground subway tracks are still at the mercy of mass transit in a city of narrow streets, single-pane windows, and rickety real estate.
[Top photo: MCNY x2010.11.2127; second photo: New-York Historical Society; third photo: MCNY x2010.11.4; fourth photo: CUNY Graduate Center Collection; fifth photo: MCNY MNY38078; sixth photo: MCNY MN11786]
Shashikant tyagi posted:
WAVELET TREES INTRODUCTION - The Wavelet Tree is a relatively new but versatile data structure, offering solutions for many problem domains such as string processing, computational geometry, and data compression. In its basic form, it stores a sequence of characters from an alphabet, enables higher-order entropy compression, and supports various fast queries. A wavelet tree is a […]
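The full post is only excerpted here. Purely as an illustration of the idea (not code from the post), a minimal, uncompressed wavelet tree supporting a rank query might be sketched in C# as follows; all class and member names are illustrative assumptions:

using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative sketch only: each node splits its alphabet range in half and
// records, per position, whether the symbol belongs to the right half.
// Rank(c, i) counts occurrences of c in the first i symbols by descending the tree.
public class WaveletTree
{
    private class Node
    {
        public char Lo, Hi;                    // alphabet range covered by this node
        public List<int> Prefix = new() { 0 }; // Prefix[i] = right-going symbols in first i positions
        public Node? Left, Right;
    }

    private readonly Node? _root;

    public WaveletTree(string s)
    {
        if (s.Length > 0)
        {
            _root = Build(s.ToCharArray(), s.Min(), s.Max());
        }
    }

    private static Node Build(char[] seq, char lo, char hi)
    {
        var node = new Node { Lo = lo, Hi = hi };
        if (lo == hi || seq.Length == 0)
        {
            return node; // leaf: all remaining symbols are identical
        }

        char mid = (char)((lo + hi) / 2); // symbols <= mid go to the left child
        foreach (var c in seq)
        {
            node.Prefix.Add(node.Prefix[^1] + (c > mid ? 1 : 0));
        }

        node.Left = Build(seq.Where(c => c <= mid).ToArray(), lo, mid);
        node.Right = Build(seq.Where(c => c > mid).ToArray(), (char)(mid + 1), hi);
        return node;
    }

    // Number of occurrences of c among the first i symbols of the sequence.
    public int Rank(char c, int i)
    {
        if (_root == null || c < _root.Lo || c > _root.Hi)
        {
            return 0;
        }

        var node = _root;
        while (node != null && node.Lo != node.Hi)
        {
            char mid = (char)((node.Lo + node.Hi) / 2);
            int ones = node.Prefix[i]; // right-going symbols in the current prefix
            if (c <= mid) { i -= ones; node = node.Left; }
            else { i = ones; node = node.Right; }
        }
        return i;
    }
}

// Example: new WaveletTree("abracadabra").Rank('a', 5) == 2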
damienbod posted:
This post shows how to implement an ASP.NET Core Razor Page application which authenticates using Azure B2C and uses custom claims implemented with the Azure B2C API connector. The claims provider is implemented using an ASP.NET Core API application, and the Azure API connector requests the data from this API. The Azure API connector adds the claims after an Azure B2C sign-in, or at whatever point you configured in the Azure B2C user flow.
An Azure App registration is set up for the ASP.NET Core Razor Page application. A client secret is used to authenticate the client, and the redirect URI is added for the app. This is a standard implementation.
Set up the API connector
The API connector is set up to add the extra claims after a sign-in. This defines the API endpoint and the authentication method. Only Basic or certificate authentication is possible for this API service, and neither is ideal for securing a service that adds extra claims to the identity. I started ngrok from the command line and used its URL to configure the Azure B2C API connector. Maybe two separate connectors could be set up for a solution: one like this for development, and a second one using the Azure App Service host address and certificate authentication.
Azure B2C user attribute
The custom claims are added to the Azure B2C user attributes. The custom claims can be added as required.
Set up the Azure B2C user flow
The Azure B2C user flow is configured to use the API connector. This flow adds the application claims to the token, which it receives from the API call used by the API connector.
The custom claims are then added using the application claims blade. This is required if the custom claims are to be included in the token.
I also added the custom claims to the Azure B2C user flow user attributes.
Azure B2C is now set up to use the custom claims, and the data for these claims will be set using the API connector service.
ASP.NET Core Razor Page
The ASP.NET Core Razor Page uses Microsoft.Identity.Web to authenticate using Azure B2C. This is a standard setup for a B2C user flow.
The main difference between an Azure B2C user flow and Azure AD authentication is the configuration. The SignUpSignInPolicyId is set to match the configured Azure B2C user flow, and the Instance uses the b2clogin.com host from the domain, unlike the AAD configuration definition.
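As a minimal sketch (not the exact code from the post), the authentication wiring could look something like the following, assuming the values described above live in an AzureAdB2C configuration section (Instance as https://<tenant>.b2clogin.com, Domain, ClientId, ClientSecret, and SignUpSignInPolicyId matching the user flow):

using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Identity.Web;
using Microsoft.Identity.Web.UI;

public class Startup
{
    public Startup(IConfiguration configuration) => Configuration = configuration;

    public IConfiguration Configuration { get; }

    public void ConfigureServices(IServiceCollection services)
    {
        // Standard Microsoft.Identity.Web setup bound to the "AzureAdB2C" section;
        // the section name and its values are assumptions for this sketch.
        services.AddAuthentication(OpenIdConnectDefaults.AuthenticationScheme)
            .AddMicrosoftIdentityWebApp(Configuration.GetSection("AzureAdB2C"));

        services.AddRazorPages()
            .AddMicrosoftIdentityUI();
    }
}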
The index Razor page returns the claims and displays the values in the UI.
public class IndexModel : PageModel
{
    [BindProperty]
    public IEnumerable<Claim> Claims { get; set; } = Enumerable.Empty<Claim>();

    public void OnGet()
    {
        Claims = User.Claims;
    }
}
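The Razor markup that renders these claims is not shown in the excerpt; a minimal Index.cshtml sketch could look like this:

@page
@model IndexModel

<h4>Claims</h4>
<table class="table">
    <tbody>
        @foreach (var claim in Model.Claims)
        {
            <tr>
                <td>@claim.Type</td>
                <td>@claim.Value</td>
            </tr>
        }
    </tbody>
</table>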
This is all the end user application requires; there is no special setup here.
ASP.NET Core API connector implementation
The API implemented for the Azure API connector uses an HTTP POST. Basic authentication is used to validate the request, as well as the client ID, which needs to match the configured App registration. This is weak authentication and should not be used in production, especially since the API provides sensitive PII data. If the request provides the correct credentials and the correct client ID, the data is returned for the email. In this demo, the email is simply echoed back in the custom claim; normally the data would be looked up in a data store.
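The RequestConnector DTO deserialized in the code below is not included in this excerpt; a minimal sketch, assuming the email and client_id property names from the API connector JSON payload, could look like this:

using System.Text.Json.Serialization;

// Sketch of the RequestConnector DTO; the JSON property names are assumed to
// match the payload Azure B2C sends to the API connector.
public class RequestConnector
{
    [JsonPropertyName("email")]
    public string Email { get; set; } = string.Empty;

    [JsonPropertyName("client_id")]
    public string ClientId { get; set; } = string.Empty;
}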
[HttpPost]
public async Task<IActionResult> PostAsync()
{
    // Check HTTP basic authorization
    if (!IsAuthorized(Request))
    {
        _logger.LogWarning("HTTP basic authentication validation failed.");
        return Unauthorized();
    }

    string content = await new System.IO.StreamReader(Request.Body).ReadToEndAsync();
    var requestConnector = JsonSerializer.Deserialize<RequestConnector>(content);

    // If input data is null, show block page
    if (requestConnector == null)
    {
        return BadRequest(new ResponseContent("ShowBlockPage", "There was a problem with your request."));
    }

    string clientId = _configuration["AzureAdB2C:ClientId"];
    if (!clientId.Equals(requestConnector.ClientId))
    {
        _logger.LogWarning("HTTP clientId is not authorized.");
        return Unauthorized();
    }

    // If email claim not found, show block page. Email is required and sent by default.
    if (requestConnector.Email == null || requestConnector.Email == ""
        || requestConnector.Email.Contains("@") == false)
    {
        return BadRequest(new ResponseContent("ShowBlockPage", "Email name is mandatory."));
    }

    var result = new ResponseContent
    {
        // use the email (or objectId) to look up the user specific claims
        MyCustomClaim = $"everything awesome {requestConnector.Email}"
    };

    return Ok(result);
}

private bool IsAuthorized(HttpRequest req)
{
    string username = _configuration["BasicAuthUsername"];
    string password = _configuration["BasicAuthPassword"];

    // Check if the HTTP Authorization header exists
    if (!req.Headers.ContainsKey("Authorization"))
    {
        _logger.LogWarning("Missing HTTP basic authentication header.");
        return false;
    }

    // Read the authorization header
    var auth = req.Headers["Authorization"].ToString();

    // Ensure the type of the authorization header is Basic
    if (!auth.StartsWith("Basic "))
    {
        _logger.LogWarning("HTTP basic authentication header must start with 'Basic '.");
        return false;
    }

    // Get the HTTP basic authorization credentials
    var cred = System.Text.Encoding.UTF8.GetString(Convert.FromBase64String(auth.Substring(6))).Split(':');

    // Evaluate the credentials and return the result
    return (cred[0] == username && cred[1] == password);
}
The ResponseContent class is used to return the data for the identity. All custom claims must be prefixed with extension_. The data is then added to the profile data.
public class ResponseContent
{
    public const string ApiVersion = "1.0.0";

    public ResponseContent()
    {
        Version = ApiVersion;
        Action = "Continue";
    }

    public ResponseContent(string action, string userMessage)
    {
        Version = ApiVersion;
        Action = action;
        UserMessage = userMessage;
        if (action == "ValidationError")
        {
            Status = "400";
        }
    }

    [JsonPropertyName("version")]
    public string Version { get; }

    [JsonPropertyName("action")]
    public string Action { get; set; }

    [JsonPropertyName("userMessage")]
    public string? UserMessage { get; set; }

    [JsonPropertyName("status")]
    public string? Status { get; set; }

    [JsonPropertyName("extension_MyCustomClaim")]
    public string MyCustomClaim { get; set; } = string.Empty;
}
With this, custom claims can be added to Azure B2C identities. This can be really useful, for example when implementing verifiable credentials using id_tokens. It is much more complicated to implement compared to other IDPs, but at least it is possible and can be solved. The technical solution for securing the API has room for improvement.
Testing
The applications can be started, and the API connector API needs to be reachable on a public endpoint. After starting the apps, start ngrok with a configuration matching the HTTP address of the API connector API.
ngrok http https://localhost:5002
The URL in the API connector configured on Azure needs to match this ngrok URL. If all is good, the applications will run and the custom claim will be displayed in the UI.
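To smoke-test the connector API directly, outside of a full B2C sign-in, something like the following could be used; the endpoint path, credentials, and client ID are placeholders for the values configured in the demo, not values from the post:

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

// Rough smoke test: sends a Basic auth header and a minimal JSON payload to the
// connector API and prints the response. Replace the placeholder URL, credentials
// and client_id with the values from your own configuration.
public static class ConnectorSmokeTest
{
    public static async Task Main()
    {
        using var client = new HttpClient();

        var basic = Convert.ToBase64String(Encoding.UTF8.GetBytes("<username>:<password>"));
        client.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Basic", basic);

        var payload = "{\"email\":\"test@example.com\",\"client_id\":\"<razor-app-client-id>\"}";
        using var content = new StringContent(payload, Encoding.UTF8, "application/json");

        var response = await client.PostAsync("https://localhost:5002/<connector-endpoint>", content);
        Console.WriteLine(response.StatusCode);
        Console.WriteLine(await response.Content.ReadAsStringAsync());
    }
}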
Notes
The profile data in this API is very sensitive, and you should use the strongest security protections possible. Using Basic authentication alone for this type of API is not a good idea; it would be great to see managed identities or something similar supported. I used Basic authentication so that I could use ngrok to demo the feature, since a public endpoint is needed for testing. I would not use this in a production deployment. I would use certificate authentication with an Azure App Service deployment, with the certificate created and deployed using Azure Key Vault. Certificate rotation would also have to be set up. I am not sure how well API connector infrastructure automation can be implemented, as I have not tried this yet. A separate security solution would need to be implemented for local development. This is all a bit messy, as these extra steps end up in costs or in developers taking shortcuts and deploying with less security.