Introduction to the GATK ReadsPathDataSource Class

GATK (the Genome Analysis Toolkit) is a widely used toolkit for genome analysis; one of its core libraries is htsjdk, which it relies on to handle high-throughput sequencing data. Within GATK, the ReadsPathDataSource class is responsible for managing and serving reads from high-throughput sequencing data files (such as BAM, SAM, and CRAM) that are accessible via java.nio Paths.
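
For orientation before the details, here is a minimal sketch of the class in use, assuming a local BAM file; the path sample.bam is a placeholder. It opens the file, prints the header sort order, and iterates over all reads.

import java.nio.file.Path;
import java.nio.file.Paths;

import htsjdk.samtools.SAMFileHeader;
import org.broadinstitute.hellbender.engine.ReadsPathDataSource;
import org.broadinstitute.hellbender.utils.read.GATKRead;

public class ReadsPathDataSourceDemo {
    public static void main(String[] args) {
        // Hypothetical input path -- replace with a real BAM on disk.
        final Path bam = Paths.get("sample.bam");

        // ReadsDataSource extends AutoCloseable, so try-with-resources closes all readers.
        try (ReadsPathDataSource readsSource = new ReadsPathDataSource(bam)) {
            final SAMFileHeader header = readsSource.getHeader();
            System.out.println("Sort order: " + header.getSortOrder());

            // Unbounded traversal: every read in the file.
            for (final GATKRead read : readsSource) {
                System.out.println(read.getName() + " @ " + read.getContig() + ":" + read.getStart());
            }
        }
    }
}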

Common Use Cases

  • Data loading: within GATK's analysis tool chain, ReadsPathDataSource is the class typically used to load sequencing reads from a given set of paths.
  • Bounded traversal (pre-filtering): traversal bounds can be set on the data source so that only reads overlapping intervals of interest (and, optionally, unmapped reads) are returned, which effectively pre-filters records as they are loaded.
  • Multi-file support: data can be loaded from several files at once, which makes analyses spanning multiple samples more convenient (see the sketch after this list).
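
The sketch below illustrates all three points, under the assumption that tumor.bam and normal.bam are hypothetical indexed BAM files sharing a sequence dictionary; the interval coordinates are arbitrary.

import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Arrays;
import java.util.Iterator;
import java.util.List;

import org.broadinstitute.hellbender.engine.ReadsPathDataSource;
import org.broadinstitute.hellbender.utils.SimpleInterval;
import org.broadinstitute.hellbender.utils.read.GATKRead;

public class BoundedTraversalDemo {
    public static void main(String[] args) {
        // Hypothetical multi-sample inputs; both must be indexed for interval-bounded traversal.
        final List<Path> bams = Arrays.asList(Paths.get("tumor.bam"), Paths.get("normal.bam"));

        try (ReadsPathDataSource readsSource = new ReadsPathDataSource(bams)) {
            // "Pre-filtering": bound the next full traversal to reads overlapping chr20:1,000,000-2,000,000.
            readsSource.setTraversalBounds(Arrays.asList(new SimpleInterval("chr20", 1_000_000, 2_000_000)));
            long overlapping = 0;
            for (final GATKRead read : readsSource) {
                overlapping++;
            }
            System.out.println("Reads overlapping the traversal interval: " + overlapping);

            // Targeted query: independent of the traversal bounds set above.
            final Iterator<GATKRead> hits = readsSource.query(new SimpleInterval("chr20", 1_500_000, 1_500_500));
            while (hits.hasNext()) {
                System.out.println(hits.next().getName());
            }
        }
    }
}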

Class Relationships
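
Expressed in text, the hierarchy visible in the sources below is:

  • GATKDataSource<T>: the root interface; it extends Iterable<T> and adds query(SimpleInterval) for targeted access by genomic interval.
  • ReadsDataSource: extends GATKDataSource<GATKRead> and AutoCloseable; it adds traversal bounds, unmapped-read queries, and access to the SAM header and sequence dictionary.
  • ReadsPathDataSource: a final class that implements ReadsDataSource on top of htsjdk SamReaders, for SAM/BAM/CRAM files reachable via java.nio Paths.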

ReadsPathDataSource Source Code

package org.broadinstitute.hellbender.engine;

import com.google.common.annotations.VisibleForTesting;
import htsjdk.samtools.MergingSamRecordIterator;
import htsjdk.samtools.SAMException;
import htsjdk.samtools.SAMFileHeader;
import htsjdk.samtools.SAMRecord;
import htsjdk.samtools.SAMSequenceDictionary;
import htsjdk.samtools.SamFileHeaderMerger;
import htsjdk.samtools.SamInputResource;
import htsjdk.samtools.SamReader;
import htsjdk.samtools.SamReaderFactory;
import htsjdk.samtools.util.CloseableIterator;
import htsjdk.samtools.util.IOUtil;
import org.apache.logging.log4j.LogManager;
import org.apache.logging.log4j.Logger;
import org.broadinstitute.hellbender.exceptions.GATKException;
import org.broadinstitute.hellbender.exceptions.UserException;
import org.broadinstitute.hellbender.utils.IntervalUtils;
import org.broadinstitute.hellbender.utils.SimpleInterval;
import org.broadinstitute.hellbender.utils.Utils;
import org.broadinstitute.hellbender.utils.gcs.BucketUtils;
import org.broadinstitute.hellbender.utils.iterators.SAMRecordToReadIterator;
import org.broadinstitute.hellbender.utils.iterators.SamReaderQueryingIterator;
import org.broadinstitute.hellbender.utils.read.GATKRead;
import org.broadinstitute.hellbender.utils.read.ReadConstants;

import java.io.IOException;
import java.nio.channels.SeekableByteChannel;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.Arrays;
import java.util.Iterator;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.function.Function;
import java.util.stream.Collectors;

/**
 * Manages traversals and queries over sources of reads which are accessible via {@link Path}s
 * (for now, SAM/BAM/CRAM files only).
 *
 * Two basic operations are available:
 *
 * -Iteration over all reads, optionally restricted to reads that overlap a set of intervals
 * -Targeted queries by one interval at a time
 */
public final class ReadsPathDataSource implements ReadsDataSource {
    private static final Logger logger = LogManager.getLogger(ReadsPathDataSource.class);

    /**
     * Mapping from SamReaders to iterators over the reads from each reader. Only one
     * iterator can be open from a given reader at a time (this is a restriction
     * in htsjdk). Iterator is set to null for a reader if no iteration is currently
     * active on that reader.
     */
    private final Map<SamReader, CloseableIterator<SAMRecord>> readers;

    /**
     * Hang onto the input files so that we can print useful errors about them
     */
    private final Map<SamReader, Path> backingPaths;

    /**
     * Only reads that overlap these intervals (and unmapped reads, if {@link #traverseUnmapped} is set) will be returned
     * during a full iteration. Null if iteration is unbounded.
     *
     * Individual queries are unaffected by these intervals -- only traversals initiated via {@link #iterator} are affected.
     */
    private List<SimpleInterval> intervalsForTraversal;

    /**
     * If true, restrict traversals to unmapped reads (and reads overlapping any {@link #intervalsForTraversal}, if set).
     * False if iteration is unbounded or bounded only by our {@link #intervalsForTraversal}.
     *
     * Note that this setting covers only unmapped reads that have no position -- unmapped reads that are assigned the
     * position of their mates will be returned by queries overlapping that position.
     *
     * Individual queries are unaffected by this setting -- only traversals initiated via {@link #iterator} are affected.
     */
    private boolean traverseUnmapped;

    /**
     * Used to create a merged Sam header when we're dealing with multiple readers. Null if we only have a single reader.
     */
    private final SamFileHeaderMerger headerMerger;

    /**
     * Are indices available for all files?
     */
    private boolean indicesAvailable;

    /**
     * Has it been closed already.
     */
    private boolean isClosed;

    /**
     * Initialize this data source with a single SAM/BAM file and validation stringency SILENT.
     *
     * @param samFile SAM/BAM file, not null.
     */
    public ReadsPathDataSource( final Path samFile ) {
        this(samFile != null ? Arrays.asList(samFile) : null, (SamReaderFactory)null);
    }

    /**
     * Initialize this data source with multiple SAM/BAM files and validation stringency SILENT.
     *
     * @param samFiles SAM/BAM files, not null.
     */
    public ReadsPathDataSource( final List<Path> samFiles ) {
        this(samFiles, (SamReaderFactory)null);
    }

    /**
     * Initialize this data source with a single SAM/BAM file and a custom SamReaderFactory
     *
     * @param samPath path to SAM/BAM file, not null.
     * @param customSamReaderFactory SamReaderFactory to use, if null a default factory with no reference and validation
     *                               stringency SILENT is used.
     */
    public ReadsPathDataSource( final Path samPath, SamReaderFactory customSamReaderFactory ) {
        this(samPath != null ? Arrays.asList(samPath) : null, customSamReaderFactory);
    }

    /**
     * Initialize this data source with multiple SAM/BAM files and a custom SamReaderFactory
     *
     * @param samPaths path to SAM/BAM file, not null.
     * @param customSamReaderFactory SamReaderFactory to use, if null a default factory with no reference and validation
     *                               stringency SILENT is used.
     */
    public ReadsPathDataSource( final List<Path> samPaths, SamReaderFactory customSamReaderFactory ) {
        this(samPaths, null, customSamReaderFactory, 0, 0);
    }

    /**
     * Initialize this data source with multiple SAM/BAM/CRAM files, and explicit indices for those files.
     *
     * @param samPaths paths to SAM/BAM/CRAM files, not null
     * @param samIndices indices for all of the SAM/BAM/CRAM files, in the same order as samPaths. May be null,
     *                   in which case index paths are inferred automatically.
     */
    public ReadsPathDataSource( final List<Path> samPaths, final List<Path> samIndices ) {
        this(samPaths, samIndices, null, 0, 0);
    }

    /**
     * Initialize this data source with multiple SAM/BAM/CRAM files, explicit indices for those files,
     * and a custom SamReaderFactory.
     *
     * @param samPaths paths to SAM/BAM/CRAM files, not null
     * @param samIndices indices for all of the SAM/BAM/CRAM files, in the same order as samPaths. May be null,
     *                   in which case index paths are inferred automatically.
     * @param customSamReaderFactory SamReaderFactory to use, if null a default factory with no reference and validation
     *                               stringency SILENT is used.
     */
    public ReadsPathDataSource( final List<Path> samPaths, final List<Path> samIndices,
                                SamReaderFactory customSamReaderFactory ) {
        this(samPaths, samIndices, customSamReaderFactory, 0, 0);
    }

    /**
     * Initialize this data source with multiple SAM/BAM/CRAM files, explicit indices for those files,
     * and a custom SamReaderFactory.
     *
     * @param samPaths paths to SAM/BAM/CRAM files, not null
     * @param samIndices indices for all of the SAM/BAM/CRAM files, in the same order as samPaths. May be null,
     *                   in which case index paths are inferred automatically.
     * @param customSamReaderFactory SamReaderFactory to use, if null a default factory with no reference and validation
     *                               stringency SILENT is used.
     * @param cloudPrefetchBuffer MB size of caching/prefetching wrapper for the data, if on Google Cloud (0 to disable).
     * @param cloudIndexPrefetchBuffer MB size of caching/prefetching wrapper for the index, if on Google Cloud (0 to disable).
     */
    public ReadsPathDataSource( final List<Path> samPaths, final List<Path> samIndices,
                                SamReaderFactory customSamReaderFactory,
                                int cloudPrefetchBuffer, int cloudIndexPrefetchBuffer) {
        this(samPaths, samIndices, customSamReaderFactory,
             BucketUtils.getPrefetchingWrapper(cloudPrefetchBuffer),
             BucketUtils.getPrefetchingWrapper(cloudIndexPrefetchBuffer) );
    }

    /**
     * Initialize this data source with multiple SAM/BAM/CRAM files, explicit indices for those files,
     * and a custom SamReaderFactory.
     *
     * @param samPaths paths to SAM/BAM/CRAM files, not null
     * @param samIndices indices for all of the SAM/BAM/CRAM files, in the same order as samPaths. May be null,
     *                   in which case index paths are inferred automatically.
     * @param customSamReaderFactory SamReaderFactory to use, if null a default factory with no reference and validation
     *                               stringency SILENT is used.
     * @param cloudWrapper caching/prefetching wrapper for the data, if on Google Cloud.
     * @param cloudIndexWrapper caching/prefetching wrapper for the index, if on Google Cloud.
     */
    public ReadsPathDataSource( final List<Path> samPaths, final List<Path> samIndices,
                                SamReaderFactory customSamReaderFactory,
                                Function<SeekableByteChannel, SeekableByteChannel> cloudWrapper,
                                Function<SeekableByteChannel, SeekableByteChannel> cloudIndexWrapper ) {
        Utils.nonNull(samPaths);
        Utils.nonEmpty(samPaths, "ReadsPathDataSource cannot be created from empty file list");

        if ( samIndices != null && samPaths.size() != samIndices.size() ) {
            throw new UserException(String.format("Must have the same number of BAM/CRAM/SAM paths and indices. Saw %d BAM/CRAM/SAMs but %d indices",
                                                  samPaths.size(), samIndices.size()));
        }

        readers = new LinkedHashMap<>(samPaths.size() * 2);
        backingPaths = new LinkedHashMap<>(samPaths.size() * 2);
        indicesAvailable = true;

        final SamReaderFactory samReaderFactory =
                customSamReaderFactory == null ?
                    SamReaderFactory.makeDefault().validationStringency(ReadConstants.DEFAULT_READ_VALIDATION_STRINGENCY) :
                    customSamReaderFactory;

        int samCount = 0;
        for ( final Path samPath : samPaths ) {
            // Ensure each file can be read
            try {
                IOUtil.assertFileIsReadable(samPath);
            }
            catch ( SAMException|IllegalArgumentException e ) {
                throw new UserException.CouldNotReadInputFile(samPath.toString(), e);
            }

            Function<SeekableByteChannel, SeekableByteChannel> wrapper =
                (BucketUtils.isEligibleForPrefetching(samPath)
                    ? cloudWrapper
                    : Function.identity());
            // if samIndices==null then we'll guess the index name from the file name.
            // If the file's on the cloud, then the search will only consider locations that are also
            // in the cloud.
            Function<SeekableByteChannel, SeekableByteChannel> indexWrapper =
                ((samIndices != null && BucketUtils.isEligibleForPrefetching(samIndices.get(samCount))
                 || (samIndices == null && BucketUtils.isEligibleForPrefetching(samPath)))
                    ? cloudIndexWrapper
                    : Function.identity());

            SamReader reader;
            if ( samIndices == null ) {
                reader = samReaderFactory.open(samPath, wrapper, indexWrapper);
            }
            else {
                final SamInputResource samResource = SamInputResource.of(samPath, wrapper);
                Path indexPath = samIndices.get(samCount);
                samResource.index(indexPath, indexWrapper);
                reader = samReaderFactory.open(samResource);
            }

            // Ensure that each file has an index
            if ( ! reader.hasIndex() ) {
                indicesAvailable = false;
            }

            readers.put(reader, null);
            backingPaths.put(reader, samPath);
            ++samCount;
        }

        // Prepare a header merger only if we have multiple readers
        headerMerger = samPaths.size() > 1 ? createHeaderMerger() : null;
    }

    /**
     * Are indices available for all files?
     */
    public boolean indicesAvailable() {
        return indicesAvailable;
    }

    /**
     * @return true if indices are available for all inputs.
     * This is identical to {@link #indicesAvailable}
     */
    @Override
    public boolean isQueryableByInterval() {
        return indicesAvailable();
    }

    /**
     * Restricts a traversal of this data source via {@link #iterator} to only return reads that overlap the given intervals,
     * and to unmapped reads if specified.
     *
     * Calls to {@link #query} are not affected by this method.
     *
     * @param intervals Our next full traversal will return reads overlapping these intervals
     * @param traverseUnmapped Our next full traversal will return unmapped reads (this affects only unmapped reads that
     *                         have no position -- unmapped reads that have the position of their mapped mates will be
     *                         included if the interval overlapping that position is included).
     */
    @Override
    public void setTraversalBounds( final List<SimpleInterval> intervals, final boolean traverseUnmapped ) {
        // Set intervalsForTraversal to null if intervals is either null or empty
        this.intervalsForTraversal = intervals != null && ! intervals.isEmpty() ? intervals : null;
        this.traverseUnmapped = traverseUnmapped;

        if ( traversalIsBounded() && ! indicesAvailable ) {
            raiseExceptionForMissingIndex("Traversal by intervals was requested but some input files are not indexed.");
        }
    }

    /**
     * @return True if traversals initiated via {@link #iterator} will be restricted to reads that overlap intervals
     *         as configured via {@link #setTraversalBounds}, otherwise false
     */
    @Override
    public boolean traversalIsBounded() {
        return intervalsForTraversal != null || traverseUnmapped;
    }

    private void raiseExceptionForMissingIndex( String reason ) {
        String commandsToIndex = backingPaths.entrySet().stream()
                .filter(f -> !f.getKey().hasIndex())
                .map(Map.Entry::getValue)
                .map(Path::toAbsolutePath)
                .map(f -> "samtools index " + f)
                .collect(Collectors.joining("\n","\n","\n"));

        throw new UserException(reason + "\nPlease index all input files:\n" + commandsToIndex);
    }

    /**
     * Iterate over all reads in this data source. If intervals were provided via {@link #setTraversalBounds},
     * iteration is limited to reads that overlap that set of intervals.
     *
     * @return An iterator over the reads in this data source, limited to reads that overlap the intervals supplied
     *         via {@link #setTraversalBounds} (if intervals were provided)
     */
    @Override
    public Iterator<GATKRead> iterator() {
        logger.debug("Preparing readers for traversal");
        return prepareIteratorsForTraversal(intervalsForTraversal, traverseUnmapped);
    }

    /**
     * Query reads over a specific interval. This operation is not affected by prior calls to
     * {@link #setTraversalBounds}
     *
     * @param interval The interval over which to query
     * @return Iterator over reads overlapping the query interval
     */
    @Override
    public Iterator<GATKRead> query( final SimpleInterval interval ) {
        if ( ! indicesAvailable ) {
            raiseExceptionForMissingIndex("Cannot query reads data source by interval unless all files are indexed");
        }

        return prepareIteratorsForTraversal(Arrays.asList(interval));
    }

    /**
     * @return An iterator over just the unmapped reads with no assigned position. This operation is not affected
     *         by prior calls to {@link #setTraversalBounds}. The underlying file must be indexed.
     */
    @Override
    public Iterator<GATKRead> queryUnmapped() {
        if ( ! indicesAvailable ) {
            raiseExceptionForMissingIndex("Cannot query reads data source by interval unless all files are indexed");
        }

        return prepareIteratorsForTraversal(null, true);
    }

    /**
     * Returns the SAM header for this data source. Will be a merged header if there are multiple readers.
     * If there is only a single reader, returns its header directly.
     *
     * @return SAM header for this data source
     */
    @Override
    public SAMFileHeader getHeader() {
        return headerMerger != null ? headerMerger.getMergedHeader() : readers.entrySet().iterator().next().getKey().getFileHeader();
    }

    /**
     * Prepare iterators over all readers in response to a request for a complete iteration or query
     *
     * If there are multiple intervals, they must have been optimized using QueryInterval.optimizeIntervals()
     * before calling this method.
     *
     * @param queryIntervals Intervals to bound the iteration (reads must overlap one of these intervals). If null, iteration is unbounded.
     * @return Iterator over all reads in this data source, limited to overlap with the supplied intervals
     */
    private Iterator<GATKRead> prepareIteratorsForTraversal( final List<SimpleInterval> queryIntervals ) {
        return prepareIteratorsForTraversal(queryIntervals, false);
    }

    /**
     * Prepare iterators over all readers in response to a request for a complete iteration or query
     *
     * @param queryIntervals Intervals to bound the iteration (reads must overlap one of these intervals). If null, iteration is unbounded.
     * @return Iterator over all reads in this data source, limited to overlap with the supplied intervals
     */
    private Iterator<GATKRead> prepareIteratorsForTraversal( final List<SimpleInterval> queryIntervals, final boolean queryUnmapped ) {
        // htsjdk requires that only one iterator be open at a time per reader, so close out
        // any previous iterations
        closePreviousIterationsIfNecessary();

        final boolean traversalIsBounded = (queryIntervals != null && ! queryIntervals.isEmpty()) || queryUnmapped;

        // Set up an iterator for each reader, bounded to overlap with the supplied intervals if there are any
        for ( Map.Entry<SamReader, CloseableIterator<SAMRecord>> readerEntry : readers.entrySet() ) {
            if (traversalIsBounded) {
                readerEntry.setValue(new SamReaderQueryingIterator(readerEntry.getKey(),
                        readers.size() > 1 ?
                                getIntervalsOverlappingReader(readerEntry.getKey(), queryIntervals) :
                                queryIntervals,
                        queryUnmapped));
            } else {
                readerEntry.setValue(readerEntry.getKey().iterator());
            }
        }

        // Create a merging iterator over all readers if necessary. In the case where there's only a single reader,
        // return its iterator directly to avoid the overhead of the merging iterator.
        Iterator<SAMRecord> startingIterator = null;
        if ( readers.size() == 1 ) {
            startingIterator = readers.entrySet().iterator().next().getValue();
        }
        else {
            startingIterator = new MergingSamRecordIterator(headerMerger, readers, true);
        }

        return new SAMRecordToReadIterator(startingIterator);
    }

    /**
     * Reduce the intervals down to only include ones that can actually intersect with this reader
     */
    private List<SimpleInterval> getIntervalsOverlappingReader(
            final SamReader samReader,
            final List<SimpleInterval> queryIntervals ){
        final SAMSequenceDictionary sequenceDictionary = samReader.getFileHeader().getSequenceDictionary();
        return queryIntervals.stream()
                .filter(interval -> IntervalUtils.intervalIsOnDictionaryContig(interval, sequenceDictionary))
                .collect(Collectors.toList());
    }

    /**
     * Create a header merger from the individual SAM/BAM headers in our readers
     *
     * @return a header merger containing all individual headers in this data source
     */
    private SamFileHeaderMerger createHeaderMerger() {
        List<SAMFileHeader> headers = new ArrayList<>(readers.size());
        for ( Map.Entry<SamReader, CloseableIterator<SAMRecord>> readerEntry : readers.entrySet() ) {
            headers.add(readerEntry.getKey().getFileHeader());
        }

        SamFileHeaderMerger headerMerger = new SamFileHeaderMerger(identifySortOrder(headers), headers, true);
        return headerMerger;
    }

    @VisibleForTesting
    static SAMFileHeader.SortOrder identifySortOrder( final List<SAMFileHeader> headers ){
        final Set<SAMFileHeader.SortOrder> sortOrders = headers.stream().map(SAMFileHeader::getSortOrder).collect(Collectors.toSet());
        final SAMFileHeader.SortOrder order;
        if (sortOrders.size() == 1) {
            order = sortOrders.iterator().next();
        } else {
            order = SAMFileHeader.SortOrder.unsorted;
            logger.warn("Inputs have different sort orders. Assuming {} sorted reads for all of them.", order);
        }
        return order;
    }

    /**
     * @return true if this {@code ReadsPathDataSource} supports serial iteration (has only non-SAM inputs). If any
     * input has type==SAM_TYPE (is backed by a SamFileReader) this will return false, since SamFileReader
     * doesn't support serial iterators, and can't be serially re-traversed without re-initialization of the
     * underlying reader (and {@code ReadsPathDataSource}.
     */
    public boolean supportsSerialIteration() {
        return !hasSAMInputs();
    }

    /**
     * Shut down this data source permanently, closing all iterations and readers.
     */
    @Override
    public void close() {
        if (isClosed) {
            return;
        }
        isClosed = true;

        closePreviousIterationsIfNecessary();

        try {
            for ( Map.Entry<SamReader, CloseableIterator<SAMRecord>> readerEntry : readers.entrySet() ) {
                readerEntry.getKey().close();
            }
        }
        catch ( IOException e ) {
            throw new GATKException("Error closing SAMReader");
        }
    }

    boolean isClosed() {
        return isClosed;
    }

    /**
     * Close any previously-opened iterations over our readers (htsjdk allows only one open iteration per reader).
     */
    private void closePreviousIterationsIfNecessary() {
        for ( Map.Entry<SamReader, CloseableIterator<SAMRecord>> readerEntry : readers.entrySet() ) {
            CloseableIterator<SAMRecord> readerIterator = readerEntry.getValue();
            if ( readerIterator != null ) {
                readerIterator.close();
                readerEntry.setValue(null);
            }
        }
    }

    // Return true if any input is has type==SAM_TYPE (is backed by a SamFileReader) since SamFileReader
    // doesn't support serial iterators and can't be serially re-traversed without re-initialization of the
    // reader
    private boolean hasSAMInputs() {
        return readers.keySet().stream().anyMatch(r -> r.type().equals(SamReader.Type.SAM_TYPE));
    }

    /**
     * Get the sequence dictionary for this ReadsPathDataSource
     *
     * @return SAMSequenceDictionary from the SAMReader backing this if there is only 1 input file, otherwise the merged SAMSequenceDictionary from the merged header
     */
    @Override
    public SAMSequenceDictionary getSequenceDictionary() {
        return getHeader().getSequenceDictionary();
    }
}
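
To show how the index bookkeeping done in the constructor surfaces at call sites, the following hedged sketch checks isQueryableByInterval() before issuing targeted queries, and falls back to a streaming traversal with manual overlap filtering otherwise. The input path reads.bam is a placeholder, and the interval coordinates are arbitrary.

import java.nio.file.Path;
import java.nio.file.Paths;
import java.util.Iterator;

import org.broadinstitute.hellbender.engine.ReadsPathDataSource;
import org.broadinstitute.hellbender.utils.SimpleInterval;
import org.broadinstitute.hellbender.utils.read.GATKRead;

public class QueryOrTraverseDemo {
    public static void main(String[] args) {
        final Path bam = Paths.get("reads.bam");   // hypothetical input

        try (ReadsPathDataSource readsSource = new ReadsPathDataSource(bam)) {
            final SimpleInterval target = new SimpleInterval("chr1", 100_000, 200_000);

            if (readsSource.isQueryableByInterval()) {
                // All inputs are indexed: use a random-access query.
                final Iterator<GATKRead> reads = readsSource.query(target);
                while (reads.hasNext()) {
                    process(reads.next());
                }
                // Unmapped reads with no assigned position are retrieved separately.
                final Iterator<GATKRead> unmapped = readsSource.queryUnmapped();
                while (unmapped.hasNext()) {
                    process(unmapped.next());
                }
            } else {
                // No index available: stream everything and filter by overlap ourselves.
                for (final GATKRead read : readsSource) {
                    if (!read.isUnmapped() && target.overlaps(read)) {
                        process(read);
                    }
                }
            }
        }
    }

    private static void process(final GATKRead read) {
        System.out.println(read.getName());
    }
}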

ReadsDataSource Source Code

package org.broadinstitute.hellbender.engine;

import htsjdk.samtools.SAMFileHeader;
import htsjdk.samtools.SAMSequenceDictionary;
import org.broadinstitute.hellbender.utils.SimpleInterval;
import org.broadinstitute.hellbender.utils.read.GATKRead;

import java.util.Iterator;
import java.util.List;

/**
 * An interface for managing traversals over sources of reads.
 *
 * Two basic operations are available:
 *
 * -Iteration over all reads, optionally restricted to reads that overlap a set of intervals
 * -Targeted queries by one interval at a time
 */
public interface ReadsDataSource extends GATKDataSource<GATKRead>, AutoCloseable {
    /**
     * Restricts a traversal of this data source via {@link #iterator} to only return reads that overlap the given intervals,
     * and to unmapped reads if specified.
     *
     * Calls to {@link #query} are not affected by this method.
     *
     * @param intervals Our next full traversal will return reads overlapping these intervals
     * @param traverseUnmapped Our next full traversal will return unmapped reads (this affects only unmapped reads that
     *                         have no position -- unmapped reads that have the position of their mapped mates will be
     *                         included if the interval overlapping that position is included).
     */
    void setTraversalBounds(List<SimpleInterval> intervals, boolean traverseUnmapped);

    /**
     * Restricts a traversal of this data source via {@link #iterator} to only return reads which overlap the given intervals.
     * Calls to {@link #query} are not affected by setting these intervals.
     *
     * @param intervals Our next full traversal will return only reads overlapping these intervals
     */
    default void setTraversalBounds(List<SimpleInterval> intervals) {
        setTraversalBounds(intervals, false);
    }

    /**
     * Restricts a traversal of this data source via {@link #iterator} to only return reads that overlap the given intervals,
     * and to unmapped reads if specified.
     *
     * Calls to {@link #query} are not affected by this method.
     *
     * @param traversalParameters set of traversal parameters to control which reads get returned by the next call
     *                            to {@link #iterator}
     */
    default void setTraversalBounds(TraversalParameters traversalParameters){
        setTraversalBounds(traversalParameters.getIntervalsForTraversal(), traversalParameters.traverseUnmappedReads());
    }

    /**
     * @return true if traversals initiated via {@link #iterator} will be restricted to reads that overlap intervals
     *         as configured via {@link #setTraversalBounds}, otherwise false
     */
    boolean traversalIsBounded();

    /**
     * @return true if this datasource supports the query() operation otherwise false.
     */
    boolean isQueryableByInterval();

    /**
     * @return An iterator over just the unmapped reads with no assigned position. This operation is not affected
     *         by prior calls to {@link #setTraversalBounds}. The underlying file must be indexed.
     */
    Iterator<GATKRead> queryUnmapped();

    /**
     * Returns the SAM header for this data source.
     *
     * @return SAM header for this data source
     */
    SAMFileHeader getHeader();

    /**
     * Get the sequence dictionary for this ReadsDataSource
     *
     * @return SAMSequenceDictionary for the reads backing this datasource.
     */
    default SAMSequenceDictionary getSequenceDictionary(){
        return getHeader().getSequenceDictionary();
    }

    /**
     * @return true if this {@code ReadsDataSource} supports multiple iterations over the data
     */
    boolean supportsSerialIteration();

    /**
     * Shut down this data source permanently, closing all iterations and readers.
     */
    @Override  //Overriden here to disallow throwing checked exceptions.
    void close();
}
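
Because the default methods above simply delegate (setTraversalBounds(List) calls setTraversalBounds(List, boolean), and getSequenceDictionary() reads the header), client code can be written against the interface rather than the concrete class. A small sketch, again with a placeholder path:

import java.nio.file.Paths;
import java.util.Collections;

import htsjdk.samtools.SAMSequenceDictionary;
import org.broadinstitute.hellbender.engine.ReadsDataSource;
import org.broadinstitute.hellbender.engine.ReadsPathDataSource;
import org.broadinstitute.hellbender.utils.SimpleInterval;
import org.broadinstitute.hellbender.utils.read.GATKRead;

public class InterfaceUsageDemo {
    public static void main(String[] args) {
        // Program against the interface; the Path-backed implementation is just one choice.
        try (ReadsDataSource readsSource = new ReadsPathDataSource(Paths.get("example.bam"))) {
            // Default method: equivalent to setTraversalBounds(intervals, false).
            readsSource.setTraversalBounds(Collections.singletonList(new SimpleInterval("chr2", 1, 50_000)));

            // Default method: pulls the dictionary out of getHeader().
            final SAMSequenceDictionary dict = readsSource.getSequenceDictionary();
            System.out.println("Contigs in dictionary: " + dict.size());

            for (final GATKRead read : readsSource) {
                System.out.println(read.getName());
            }
        }
    }
}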

GATKDataSource Source Code

package org.broadinstitute.hellbender.engine;

import org.broadinstitute.hellbender.utils.SimpleInterval;

import java.util.Iterator;

/**
 * A GATKDataSource is something that can be iterated over from start to finish
 * and/or queried by genomic interval. It is not necessarily file-based.
 *
 * @param <T> Type of data in the data source
 */
public interface GATKDataSource<T> extends Iterable<T> {
    Iterator<T> query(final SimpleInterval interval);
}
