---
execute:
  echo: false
  freeze: auto
date: "`r Sys.Date()`"
format:
  html:
    code-fold: true
    code-tools: true
    code-summary: "Show the code"
---

# Methods

```{r echo = F, message=F}
# use same quarto settings as outlined in wqx repo (view code, link to github, download ms word or pdf, etc)

# AKTEMP separate templates
## file
## station

# clear environment
rm(list = ls())  # clear environment
graphics.off()   # clear graphics device
cat("\014")      # clear console

# load packages
library(tidyverse)
library(readxl)
library(magrittr)
library(xfun)
library(janitor)
library(hms)
library(plotly)
library(qdapRegex)
library(lubridate)
library(leaflet)

# use quarto render --cache-refresh in terminal if necessary
```

<br>

## Water Temperature Loggers

### Locations

Our project includes water temperature data from fifteen locations throughout the Beaver Creek watershed. In summer 2022 we established thirteen new sites to monitor water temperature with HOBO Water Temp Pro v2 loggers, in addition to the one site previously established by Kenai Watershed Forum. The USGS Alaska Science Center also established a real-time gauging station in the lower reach, which records water temperature in addition to discharge (station \# 15266500). These data are available online at <https://waterdata.usgs.gov/monitoring-location/15266500/>.

Site locations and other metadata are available for download at the link below.

```{r echo = F, message=FALSE}
# prepare temperature logger station metadata
station <- read_xlsx("other/input/temperature_logger_site_summary.xlsx") %>%
  clean_names() %>%
  rename(logger_serial = logger_serial_num_fieldsheet,
         latitude = latitude_dd,
         longitude = longitude_dd) %>%
  select(site_sheet, latitude, longitude, logger_serial, map_label, location_notes)

# format station info so that contents fit in the AKTEMP template
station %<>%
  mutate("Code" = map_label,
         "Latitude" = latitude,
         "Longitude" = longitude,
         "Timezone" = "US/Alaska (UTC-9 / UTC-8)",
         "Description" = location_notes,
         "Waterbody Name" = "Beaver Creek",
         "Waterbody Type" = "STREAM",
         "Placement" = "MAIN", # all loggers in this project placed in mainstem channel
         "Well-mixed" = "TRUE",
         "Active" = "TRUE",
         "Reference URL" = "TBD",
         "Private" = "FALSE")

# acquire and inspect all column names from the AKTEMP template
aktemp_station_colnames <- read_excel("other/input/AKTEMP_templates/AKTEMP-stations-template.xlsx",
                                      sheet = "STATIONS") %>%
  colnames()

# subset columns to those provided in the AKTEMP template
station %<>%
  select(one_of(aktemp_station_colnames), "logger_serial") %>%
  transform(logger_serial = as.character(logger_serial)) %>%
  filter(!is.na(Code))

# export csv to local repo
write.csv(station, "other/output/station.csv", row.names = F)
```

```{r, echo = F}
# Download station metadata
xfun::embed_file("other/output/station.csv",
                 text = "Download Temperature Logger Site Metadata for USGS / KWF Beaver Creek Hydrology Project")
```

<br>
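The setup chunk loads `leaflet`, which is not otherwise used on this page. As a quick, self-contained check of the station coordinates (independent of the ArcGIS Online map below), a minimal sketch along these lines could render them; the marker styling is arbitrary:

```{r, echo = F, eval = F}
# sketch only: sanity-check map of station coordinates (not rendered in this report)
leaflet(station) %>%
  addTiles() %>%
  addCircleMarkers(lng = ~Longitude, lat = ~Latitude, label = ~Code, radius = 4)
```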
An ArcGIS Online map of site locations is displayed below. The map may also be accessed at <https://arcg.is/0ySarv1>.

<br>

```{=html}
<style>
.embed-container {position: relative; padding-bottom: 80%; height: 0; max-width: 100%;}
.embed-container iframe, .embed-container object, .embed-container embed {position: absolute; top: 0; left: 0; width: 100%; height: 100%;}
small {position: absolute; z-index: 40; bottom: 0; margin-bottom: -15px;}
</style>
```

::: embed-container
<small><a href="//kwf.maps.arcgis.com/apps/Embed/index.html?webmap=97eeaaf4e3ce4ecfa7ee2228b0373c07&extent=-151.3279,60.5578,-150.8627,60.6559&home=true&zoom=true&scale=true&search=true&searchextent=true&basemap_gallery=true&disable_scroll=true&theme=light" style="color:#0000FF;text-align:left" target="_blank">View larger map</a></small><br>
<iframe width="500" height="400" frameborder="0" scrolling="no" marginheight="0" marginwidth="0" title="Beaver Creek Groundwater and Stream Temperature" src="//kwf.maps.arcgis.com/apps/Embed/index.html?webmap=97eeaaf4e3ce4ecfa7ee2228b0373c07&extent=-151.3279,60.5578,-150.8627,60.6559&home=true&zoom=true&previewImage=false&scale=true&search=true&searchextent=true&basemap_gallery=true&disable_scroll=true&theme=light"></iframe>
:::

<br>

### Water Temperature Logger QA/QC Checks

#### Pre-deployment

Prior to deployment, all water temperature loggers undergo a QA/QC check using an ice-water bath and room-temperature water as described in [@mauger2015].
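The pre-deployment check itself is not scripted in this report. A minimal sketch of how the bath readings could be summarized is below, assuming a hypothetical csv of check results (`pre_deployment_bath_checks.csv`, with columns `logger_serial`, `bath`, `temp_C`, and `reference_C`); the 0.25 °C pass tolerance is likewise an assumption, to be replaced with the project's accepted accuracy standard.

```{r, echo = F, eval = F}
# sketch only: summarize pre-deployment bath checks per logger.
# The file path, column names, and 0.25 degC tolerance are assumptions, not project standards.
bath_check <- read_csv("other/input/pre_deployment_bath_checks.csv") %>%
  group_by(logger_serial, bath) %>%                   # bath = "ice" or "room"
  summarise(mean_dev_C = mean(temp_C - reference_C),  # mean deviation from reference thermometer
            .groups = "drop") %>%
  mutate(pass = abs(mean_dev_C) <= 0.25)              # assumed tolerance
```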
#### Site Checks

Content TBD here.

#### Post-deployment

<br>

##### Fall 2022 Site Visits

We downloaded data from all loggers in September/October 2022 and reviewed it in spring 2023. Each logger's time series was visually inspected in an interactive plotly chart for data that are not representative of stream channel conditions, such as exposure to air or burial in sand.

Segments of each time series identified as non-representative were flagged in a separate csv file, which was then applied to remove those segments from the record.

An example plot for one logger is shown below, with flagged data in red and retained data in black.

```{r, echo = F, cache = TRUE}
# Prepare water temperature data for inspection

# read in separate csv to match site metadata with logger serial number
# check what format / procedure is needed for data upload to AKTEMP; see if the above process matches the needs of this outcome

## read in and prepare logger data
# Note: we have logger data both from the KWF site and the nearby (~200 m upstream) UAA site

## for "dir", choose the location of all UNMODIFIED csv files downloaded from HOBOware
# roll over the csv files in the "input" folder, pulling in data
dir <- "other/input/temperature_loggers/fall_2022/csv/"
csvNames <- sort(list.files(dir, full.names = TRUE))
allData <- list()
colTypes <- cols("i", "c", "d", "?", "_", "_", "_")

# run loop
for (i in seq_along(csvNames)) {
  tmpData <- read_csv(csvNames[i], col_types = colTypes, skip = 1) %>%
    select(starts_with(c("Date", "Temp")))
  colnames(tmpData) <- c("date_time", "temp_C")
  tmpData %<>% transform(date_time = mdy_hms(date_time))
  tmpData$logger_serial <- ex_between(csvNames[i], "csv/", ".csv")
  allData[[i]] <- tmpData
}

# merge all the datasets into one data frame
allData <- do.call(rbind, allData) %>%
  distinct()
```

```{r echo = F}
# general approach:
# 1.) use plotly to visually observe each time series from an individual logger
# 2.) create csv of time periods that need to be flagged for each logger

# manual procedure:
# create a ggplotly chart for each time series, one at a time
# set logger id
# remove hashtags below one at a time to plot. Double-hashtag indicates that a visual inspection was completed
##logger <- 10816960
##logger <- 20012591
##logger <- 20625008
##logger <- 21235340
##logger <- 21235341
##logger <- 21235343
##logger <- 21444843
##logger <- 21444844
##logger <- 21444869
##logger <- 21444870
##logger <- 21444872
##logger <- 21444873
logger <- 21444874
##logger <- 21488145
```

```{r, echo = F, eval = F}
# remove "eval = F" when we want to use this plot during qa/qc

# plot
ggplotly(
  p <- allData %>%
    # modify site one at a time here to visually inspect datasets
    filter(logger_serial == logger) %>%
    ggplot(aes(date_time, temp_C)) +
    geom_point() +
    ggtitle(paste("Logger", logger, "pre-inspection")),
  # plot size
  height = 350, width = 600
)
```

```{r echo = F}
# read in file of visually identified flagged data
flagData <- read.csv("other/input/temperature_loggers/fall_2022/qa_qc/temp_logger_flagged_data.csv", sep = ",") %>%
  select(-notes) %>%
  drop_na() %>%
  transform(date_time_start = mdy_hm(date_time_start),
            date_time_stop = mdy_hm(date_time_stop)) %>%
  transform(logger_serial = as.character(logger_serial))

# mark flagged events in the full record
# ------------------------------------------------------------------------------

# all records start as usable
allData$useData <- 1

# then we roll over the flags, identifying matching rows in the full record
flagPivs <- unique(unlist(apply(flagData, 1, function(v) {
  which(allData$logger_serial == v[1] &
          allData$date_time >= as_datetime(v[2]) &
          allData$date_time <= as_datetime(v[3]))
})))

# finally, we mark the useData values of all flagged events as 0
allData$useData[flagPivs] <- 0
```

```{r echo = F}
# plot with flagged data in red
ggplotly(
  p <- allData %>%
    filter(logger_serial == logger) %>%
    mutate(status = case_when(useData == 1 ~ "Keep",
                              useData == 0 ~ "Remove")) %>%
    ggplot(aes(date_time, temp_C, color = status)) +
    geom_point() +
    scale_colour_manual("Status", values = c("black", "red")) +
    ggtitle(paste("Logger", logger, "with flagged data")),
  # plot size
  height = 350, width = 600
)
```

Time periods flagged for individual loggers are recorded and available to view at the download below.

```{r, echo = F}
# Download flagged data
xfun::embed_file("other/input/temperature_loggers/fall_2022/qa_qc/temp_logger_flagged_data.csv",
                 text = "Download Temperature Logger Flagged Data Records from Fall 2022 QA/QC")

# rename existing allData dataframe as Fall 2022 specifically
allData_fall22 <- allData
rm(allData)
```
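For reference, the flag file read in above has the layout sketched below. The column names are taken from the QA/QC code; the values shown are purely illustrative.

```{r, echo = F, eval = F}
# illustrative layout of temp_logger_flagged_data.csv (values are hypothetical)
tribble(
  ~logger_serial, ~date_time_start, ~date_time_stop,  ~notes,
  "21444874",     "9/1/2022 12:00", "9/3/2022 15:30", "logger exposed to air"
)
```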
<br>

##### Spring/Summer 2023 Site Visits

All water temperature logger sites were visited in spring/summer 2023, and log files were downloaded from each logger. The QA/QC process described above for Fall 2022 was applied in an identical fashion.

```{r, echo = F, cache = TRUE}
# Prepare water temperature data for inspection

# read in separate csv to match site metadata with logger serial number
# check what format / procedure is needed for data upload to AKTEMP; see if the above process matches the needs of this outcome

## read in and prepare logger data
# Note: we have logger data both from the KWF site and the nearby (~200 m upstream) UAA site

## choose the location of all UNMODIFIED csv files downloaded from HOBOware
# roll over the csv files in the "input" folder, pulling in data
dir <- "other/input/temperature_loggers/summer_2023/csv/"
csvNames <- sort(list.files(dir, full.names = TRUE))
allData <- list()
colTypes <- cols("i", "c", "d", "?", "_", "_", "_")

# run loop
for (i in seq_along(csvNames)) {
  tmpData <- read_csv(csvNames[i], col_types = colTypes, skip = 1) %>%
    select(starts_with(c("Date", "Temp")))
  colnames(tmpData) <- c("date_time", "temp_C")
  tmpData %<>% transform(date_time = mdy_hms(date_time))
  tmpData$logger_serial <- ex_between(csvNames[i], "csv/", ".csv")
  allData[[i]] <- tmpData
}

# merge all the datasets into one data frame
allData <- do.call(rbind, allData) %>%
  distinct()
```

```{r echo = F}
# temperature unit conversion
# one logger download file (#10816960) for summer 2023 was recorded in Fahrenheit. We convert it to Celsius.
allData %<>%
  mutate(temp_C = case_when(
    logger_serial == "10816960" ~ (temp_C - 32) * (5/9),
    TRUE ~ temp_C
  ))
```
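Unit mismatches like this one were caught by visual inspection (and again for the Fall 2023 download below). A hedged sketch of automating the check is shown here; it assumes the HOBOware export names the temperature column with its unit (e.g. "Temp, °F (...)"), which should be verified against the actual files.

```{r, echo = F, eval = F}
# sketch only: detect csv files whose temperature column header indicates Fahrenheit.
# Assumes the HOBOware header row (after skip = 1) labels the column with its unit.
is_fahrenheit <- function(csv_path) {
  hdr <- names(read_csv(csv_path, skip = 1, n_max = 0, show_col_types = FALSE))
  any(grepl("°F", hdr[startsWith(hdr, "Temp")], fixed = TRUE))
}
needs_conversion <- csvNames[sapply(csvNames, is_fahrenheit)]
```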
```{r echo = F}
# general approach:
# 1.) use plotly to visually observe each time series from an individual logger
# 2.) create csv of time periods that need to be flagged for each logger

# manual procedure:
# create a ggplotly chart for each time series, one at a time
# set logger id
# remove hashtags below one at a time to plot. Double-hashtag indicates that a visual inspection was completed for the summer 2023 download
logger <- 10816960
#logger <- 20012591 # [This logger is located at the KWF Beaver Creek long-term site. Data from this logger is not available after fall 2022, but data from the twin logger at this location, #21488142, is.]
##logger <- 21488142
##logger <- 20625008
##logger <- 21235340
##logger <- 21235341
##logger <- 21235343
##logger <- 21444843
##logger <- 21444844
##logger <- 21444869
##logger <- 21444870
##logger <- 21444872
##logger <- 21444873
##logger <- 21444874
##logger <- 21488145

# working here: next steps:
## adjust script to incorporate the event when a new logger replaces the old one at the same site
## locate missing log file for site #15 (by gravel pit)
## decide on temp metrics outcome + literature
```

```{r, echo = F, eval = F}
# remove "eval = F" when we want to use this plot during qa/qc

# plot
ggplotly(
  p <- allData %>%
    # modify site one at a time here to visually inspect datasets
    filter(logger_serial == logger) %>%
    ggplot(aes(date_time, temp_C)) +
    geom_point() +
    ggtitle(paste("Logger", logger, "pre-inspection")),
  # plot size
  height = 350, width = 600
)
```

```{r echo = F}
# read in file of visually identified flagged data
flagData <- read.csv("other/input/temperature_loggers/summer_2023/qa_qc/temp_logger_flagged_data.csv", sep = ",") %>%
  select(-notes) %>%
  drop_na() %>%
  transform(date_time_start = mdy_hm(date_time_start),
            date_time_stop = mdy_hm(date_time_stop)) %>%
  transform(logger_serial = as.character(logger_serial))

# mark flagged events in the full record
# ------------------------------------------------------------------------------

# all records start as usable
allData$useData <- 1

# then we roll over the flags, identifying matching rows in the full record
flagPivs <- unique(unlist(apply(flagData, 1, function(v) {
  which(allData$logger_serial == v[1] &
          allData$date_time >= as_datetime(v[2]) &
          allData$date_time <= as_datetime(v[3]))
})))

# finally, we mark the useData values of all flagged events as 0
allData$useData[flagPivs] <- 0
```

```{r echo = F, eval = F}
# plot with flagged data in red
ggplotly(
  p <- allData %>%
    filter(logger_serial == logger) %>%
    mutate(status = case_when(useData == 1 ~ "Keep",
                              useData == 0 ~ "Remove")) %>%
    ggplot(aes(date_time, temp_C, color = status)) +
    geom_point() +
    scale_colour_manual("Status", values = c("black", "red")) +
    ggtitle(paste("Logger", logger, "with flagged data")),
  # plot size
  height = 350, width = 600
)
```

```{r, echo = F}
# rename existing allData dataframe as Summer 2023 specifically
allData_summer23 <- allData
rm(allData)
```
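Before the Fall 2023 downloads are folded in below, the period of record per logger can be checked against the field schedule. A minimal sketch (logger serials converted to character, as in the final export chunk):

```{r, echo = F, eval = F}
# sketch only: period of record and observation count per logger after summer 2023 QA/QC
allData_summer23 %>%
  mutate(logger_serial = as.character(logger_serial)) %>%
  group_by(logger_serial) %>%
  summarise(first_obs = min(date_time),
            last_obs  = max(date_time),
            n_obs     = n(),
            .groups   = "drop")
```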
<br>

##### Fall 2023 Site Visits

Most water temperature logger sites were visited in Fall 2023, and log files were downloaded from each logger. The QA/QC process described above for Fall 2022 was applied in an identical fashion.

Logger sites that were not downloaded in Fall 2023 include TL-1, TL-3, and TL-11. These sites will be visited by the end of October 2023 if weather conditions allow.

```{r, echo = F, cache = TRUE}
# Prepare water temperature data for inspection

# read in separate csv to match site metadata with logger serial number
# check what format / procedure is needed for data upload to AKTEMP; see if the above process matches the needs of this outcome

## read in and prepare logger data
## choose the location of all UNMODIFIED csv files downloaded from HOBOware
# roll over the csv files in the "input" folder, pulling in data
dir <- "other/input/temperature_loggers/fall_2023/csv/"
csvNames <- sort(list.files(dir, full.names = TRUE))
allData <- list()
colTypes <- cols("i", "c", "d", "?", "_", "_", "_")

# run loop
for (i in seq_along(csvNames)) {
  tmpData <- read_csv(csvNames[i], col_types = colTypes, skip = 1) %>%
    select(starts_with(c("Date", "Temp")))
  colnames(tmpData) <- c("date_time", "temp_C")
  tmpData %<>% transform(date_time = mdy_hms(date_time))
  tmpData$logger_serial <- ex_between(csvNames[i], "csv/", ".csv")
  allData[[i]] <- tmpData
}

# merge all the datasets into one data frame
allData <- do.call(rbind, allData) %>%
  distinct()
```

```{r echo = F}
# NOTE: upon inspection, all of the Fall 2023 temperature logger downloads are in Fahrenheit instead of Celsius. Convert to Celsius.

# temperature unit conversion
allData %<>%
  mutate(temp_C = (temp_C - 32) * (5/9))
```

```{r echo = F}
# general approach:
# 1.) use plotly to visually observe each time series from an individual logger
# 2.) create csv of time periods that need to be flagged for each logger

# manual procedure:
# create a ggplotly chart for each time series, one at a time
# set logger id
# remove hashtags below one at a time to plot. Double-hashtag indicates that a visual inspection was completed for the Fall 2023 download
##logger <- 10816960
##logger <- 20012591 # [This logger is paired with a twin logger at the KWF Beaver Creek long-term site]
##logger <- 21488142 # [This logger is paired with a twin logger at the KWF Beaver Creek long-term site]
##logger <- 20625008
##logger <- 21235340
##logger <- 21235341
##logger <- 21235343
#logger <- 21444843 ## csv not present 10/11/2023. did not check site fall 2023
##logger <- 21444844
logger <- 21444869
##logger <- 21444870
##logger <- 21444872
##logger <- 21444873
##logger <- 21444874
##logger <- 21488145 ## csv not present 10/11/2023. did not check site fall 2023

# working here: next steps:
## adjust script to incorporate the event when a new logger replaces the old one at the same site
## locate missing log file for site #15 (by gravel pit)
## decide on temp metrics outcome + literature
# need to combine/average the two loggers at the KWF long-term site (see sketch below)
# check summer 2023 paper forms to confirm serial numbers ... make sure these correspond with the metadata file
```
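One of the to-dos noted above is combining the twin loggers at the KWF long-term site (TL-12; serials 20012591 and 21488142, per the comments above). A minimal sketch of averaging their concurrent readings into a single series follows; the combined series label is hypothetical.

```{r, echo = F, eval = F}
# sketch only: average the twin TL-12 loggers at matching timestamps into one series
twin_serials <- c("20012591", "21488142")
tl12_combined <- allData %>%
  mutate(logger_serial = as.character(logger_serial)) %>%  # as in the final export chunk
  filter(logger_serial %in% twin_serials) %>%
  group_by(date_time) %>%
  summarise(temp_C = mean(temp_C), .groups = "drop") %>%
  mutate(logger_serial = "TL-12_combined")                 # hypothetical label
```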
```{r, echo = F, eval = F}
# remove "eval = F" when we want to use this plot during qa/qc

# plot
ggplotly(
  p <- allData %>%
    # modify site one at a time here to visually inspect datasets
    filter(logger_serial == logger) %>%
    ggplot(aes(date_time, temp_C)) +
    geom_point() +
    ggtitle(paste("Logger", logger, "pre-inspection")),
  # plot size
  height = 350, width = 600
)
```

```{r echo = F}
# read in file of visually identified flagged data
flagData <- read.csv("other/input/temperature_loggers/fall_2023/qa_qc/temp_logger_flagged_data.csv", sep = ",") %>%
  select(-notes) %>%
  drop_na() %>%
  transform(date_time_start = mdy_hm(date_time_start),
            date_time_stop = mdy_hm(date_time_stop)) %>%
  transform(logger_serial = as.character(logger_serial))

# mark flagged events in the full record
# ------------------------------------------------------------------------------

# all records start as usable
allData$useData <- 1

# then we roll over the flags, identifying matching rows in the full record
flagPivs <- unique(unlist(apply(flagData, 1, function(v) {
  which(allData$logger_serial == v[1] &
          allData$date_time >= as_datetime(v[2]) &
          allData$date_time <= as_datetime(v[3]))
})))

# finally, we mark the useData values of all flagged events as 0
allData$useData[flagPivs] <- 0
```

```{r echo = F, eval = F}
# plot with flagged data in red
ggplotly(
  p <- allData %>%
    filter(logger_serial == logger) %>%
    mutate(status = case_when(useData == 1 ~ "Keep",
                              useData == 0 ~ "Remove")) %>%
    ggplot(aes(date_time, temp_C, color = status)) +
    geom_point() +
    scale_colour_manual("Status", values = c("black", "red")) +
    ggtitle(paste("Logger", logger, "with flagged data")),
  # plot size
  height = 350, width = 600
)
```

```{r, echo = F}
# rename existing allData dataframe as Fall 2023 specifically
allData_fall23 <- allData
rm(allData)
```

```{r echo = F}
# combining data from multiple seasons

# join post-QA/QC dataframes from the fall 2022, summer 2023, and fall 2023 downloads
allData <- bind_rows(allData_fall22, allData_summer23, allData_fall23) %>%
  # remove redundant data
  distinct()

# remove old dataframes
rm(allData_fall22, allData_summer23, allData_fall23)
```
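Formatting for the AKTEMP database is flagged as not yet done in the export chunk below. A hedged sketch of that reshaping step is given here; the output column names are assumptions, and the real ones should be read from the AKTEMP data template, mirroring the `read_excel()` approach used for the stations template above.

```{r, echo = F, eval = F}
# sketch only: reshape the combined record toward an AKTEMP-style upload.
# Column names are assumptions; replace with the actual AKTEMP template columns.
aktemp_series <- allData %>%
  mutate(logger_serial = as.character(logger_serial)) %>%
  left_join(station, by = "logger_serial") %>%
  transmute(station_code = Code,
            datetime     = date_time,
            temp_c       = temp_C,
            flagged      = useData == 0)
```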
<br>

### Download Preliminary Water Temperature Data

Water temperature data will be used in ongoing watershed-scale modeling efforts. At a later stage, the data will be housed in a public repository such as AKTEMP (<https://aktemp.uaa.alaska.edu>).

Preliminary water temperature files are available for download at the link below.

\[NOTE: the long-term KWF logger site, "TL-12", has a pair of twin loggers at one location. Data from 2019-2022 at this site are [available for download at this hyperlink](https://github.com/Kenai-Watershed-Forum/kwf_temp_loggers/blob/main/other/output/beaver_creek_2019_2022).\]

```{r echo = F, message = F}
# exclude flagged data and NA values
allData %<>%
  filter(useData == 1) %>%
  select(-useData) %>%
  transform(logger_serial = as.character(logger_serial))

# join temperature data to site metadata
allData <- left_join(allData, station)

# prep format for AKTEMP database
## --> not done here yet (see sketch above)

# set output directory
temp_output_dir <- "other/output/post_qa_data/"

# remove old zip file and individual csv files
unlink(paste0(temp_output_dir, "*"))

# export an individual time series from each site
# method adapted from https://gist.github.com/jflanaga/1ab2fa1434064780d2237e73d9e669c4
allData %>%
  group_by(logger_serial) %>%
  group_walk(~ write_csv(.x, paste0(temp_output_dir, .y$logger_serial, ".csv")))

# zip multiple csv files
zip_files <- list.files(path = temp_output_dir, pattern = ".csv$", full.names = TRUE)
zipfile <- paste0(temp_output_dir, "beaver_creek_water_temps_", Sys.Date(), ".zip")

# create new zipfile
zip::zipr(zipfile, files = zip_files)
```

```{r echo = F}
# download zip file
xfun::embed_file(zipfile, text = "Download Zip File of Post-QA/QC Water Temperature Logger Data")
```

```{r echo = F}
# Parts of the above script were developed with assistance from Dr. Jose Bentancourt in April 2023: https://www.linkedin.com/in/djbetancourt/
```

## References

Mauger, Sue, Rebecca Shaftel, E. Jamie Trammell, Marcus Geist, and Dan Bogan. 2015. "Stream Temperature Data Collection Standards for Alaska: Minimum Standards to Generate Data Useful for Regional-Scale Analyses." *Journal of Hydrology: Regional Studies* 4, Part B (September): 431-38. <https://doi.org/10.1016/j.ejrh.2015.07.008>.